DCAgent/a1-codeforces
Text generation · Concurrency cost: 1 · Model size: 8B · Quant: FP8 · Context length: 32k · Published: Mar 27, 2026 · License: other · Architecture: Transformer

DCAgent/a1-codeforces is an 8-billion-parameter language model fine-tuned from Qwen/Qwen3-8B. It was trained on the DCAgent/codeforces-sandboxes-1_10k_glm_4.7_traces_jupiter dataset, indicating optimization for competitive-programming and code-generation tasks in sandboxed environments. Its 32,768-token context length makes it suitable for processing long code snippets and problem descriptions.
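A minimal sketch of running the model for inference with the Hugging Face `transformers` library, assuming the model ID above is available on the Hub and exposes a standard chat template; the helper name and generation parameters are illustrative, not part of this card:

```python
# Sketch: generate a candidate solution with DCAgent/a1-codeforces.
# Assumes hardware with enough memory for an 8B checkpoint; all
# parameter values here are illustrative defaults, not from the card.
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "DCAgent/a1-codeforces"

def solve(problem_statement: str, max_new_tokens: int = 1024) -> str:
    """Generate a candidate solution for a competitive-programming problem."""
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto")
    # The 32k context leaves ample room for long problem statements.
    messages = [{"role": "user", "content": problem_statement}]
    prompt = tokenizer.apply_chat_template(
        messages, tokenize=False, add_generation_prompt=True
    )
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    output = model.generate(**inputs, max_new_tokens=max_new_tokens)
    # Strip the prompt tokens, return only the generated completion.
    return tokenizer.decode(
        output[0][inputs.input_ids.shape[1]:], skip_special_tokens=True
    )
```

The FP8 quantization noted above may require a compatible runtime (e.g. recent GPU kernels); loading in bf16 is a common fallback.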
