DCAgent/c1_kimi_k2.5_fixed
Text generation · Concurrency cost: 1 · Model size: 8B · Quantization: FP8 · Context length: 32K · Published: Apr 10, 2026 · License: other · Architecture: Transformer · Status: Cold

DCAgent/c1_kimi_k2.5_fixed is an 8-billion-parameter language model fine-tuned from Qwen/Qwen3-8B. It was trained on the dataset at /e/scratch/jureap59/raoof1/sft_data/hf_hub/datasets--DCAgent--c1_kimi_k2.5_fixed/snapshots/5807137b49d0d1d27e7b100da3e8d4156ddb94e3_thinking_preprocessed, whose "thinking_preprocessed" suffix suggests a specialization in processing or generating reasoning-style or internal-monologue content. With a 32K context length, it is suitable for tasks requiring extended conversations or long-document understanding.
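Because the model advertises a 32K context window, a caller should budget prompt tokens before sending a request. A minimal sketch, assuming a rough 4-characters-per-token heuristic (an approximation only; an exact count requires the model's own tokenizer, which the page does not specify):

```python
def fits_context(prompt: str, max_new_tokens: int, ctx_len: int = 32_768,
                 chars_per_token: int = 4) -> bool:
    """Rough check that prompt plus generation budget fits a 32K context.

    Uses a crude chars-per-token heuristic; the real count depends on
    the model's tokenizer, so treat this as a pre-flight sanity check.
    """
    est_prompt_tokens = len(prompt) // chars_per_token + 1
    return est_prompt_tokens + max_new_tokens <= ctx_len

# A short prompt easily fits when reserving 1024 tokens for the reply.
print(fits_context("Summarize the attached report.", max_new_tokens=1024))
```

In practice one would replace the heuristic with the fine-tuned model's tokenizer to count tokens exactly before dispatching a request.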
