Sangsang/CI-7B-Feedback-merged
Text Generation · Concurrency Cost: 1 · Model Size: 7.6B · Quant: FP8 · Ctx Length: 32k · Published: Mar 4, 2026 · Architecture: Transformer

Sangsang/CI-7B-Feedback-merged is a 7.6-billion-parameter language model published by Sangsang. The "-merged" suffix indicates a merged checkpoint, likely combining a base model with feedback-driven fine-tuning. Its architecture details, training data, and primary differentiators are not documented here, so it is best treated as a base or experimental model until further context is available for specific applications.
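How this particular checkpoint was merged is not documented, but one common technique for producing "-merged" models is linear weight interpolation between a base and a fine-tuned state dict. The sketch below is purely illustrative (plain Python lists stand in for tensors; the function and parameter names are hypothetical, not from this model's actual pipeline):

```python
def merge_weights(base, finetuned, alpha=0.5):
    """Linearly interpolate two state dicts:
    merged = (1 - alpha) * base + alpha * finetuned.
    Assumes both dicts share the same keys and shapes."""
    merged = {}
    for name, w_base in base.items():
        w_ft = finetuned[name]
        merged[name] = [(1 - alpha) * b + alpha * f
                        for b, f in zip(w_base, w_ft)]
    return merged

# Toy example with two-element "weights":
base = {"layer.weight": [0.0, 2.0]}
finetuned = {"layer.weight": [1.0, 4.0]}
print(merge_weights(base, finetuned, alpha=0.5))
# → {'layer.weight': [0.5, 3.0]}
```

With `alpha=0.5` this is an even average of the two checkpoints; real merges often tune `alpha` per layer or merge more than two models.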
