DCAgent/c1_gpt53_codex_fixed
TEXT GENERATION
Concurrency Cost: 1 · Model Size: 8B · Quant: FP8 · Ctx Length: 32k · Published: Apr 10, 2026 · License: other · Architecture: Transformer · Cold

DCAgent/c1_gpt53_codex_fixed is an 8-billion-parameter causal language model fine-tuned from Qwen/Qwen3-8B. It was adapted on the dataset stored at /e/scratch/jureap59/raoof1/sft_data/hf_hub/datasets--DCAgent--c1_gpt53_codex_fixed/snapshots/38a6f93a475416e79a04e373ed2b1ff2d1d7c45a_thinking_preprocessed, which suggests a specialization in the domains covered by that training data. With a context length of 32768 tokens, it can process extensive inputs relevant to its fine-tuning domain.
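A minimal usage sketch is shown below, assuming the model is published under the repo id `DCAgent/c1_gpt53_codex_fixed` and loads with the Hugging Face `transformers` library; the `truncate_to_context` helper is a hypothetical illustration of keeping inputs within the 32768-token window.

```python
MAX_CTX = 32768  # context length stated on this model card


def truncate_to_context(token_ids, max_ctx=MAX_CTX):
    """Keep only the most recent tokens that fit in the context window."""
    return token_ids[-max_ctx:]


if __name__ == "__main__":
    # Imported here so the helper above stays usable without transformers installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    MODEL_ID = "DCAgent/c1_gpt53_codex_fixed"  # assumed repo id
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, torch_dtype="auto")

    prompt = "Write a function that reverses a string."
    inputs = tokenizer(prompt, return_tensors="pt")
    inputs["input_ids"] = inputs["input_ids"][:, -MAX_CTX:]  # stay within 32k context
    out = model.generate(**inputs, max_new_tokens=256)
    print(tokenizer.decode(out[0], skip_special_tokens=True))
```

Because the model is quantized to FP8 and has 8B parameters, it should fit on a single modern accelerator; `torch_dtype="auto"` defers the dtype choice to the checkpoint's own configuration.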
