DCAgent/a1-curriculum_hard
Text Generation · Concurrency Cost: 1 · Model Size: 8B · Quant: FP8 · Ctx Length: 32k · Published: Mar 25, 2026 · License: other · Architecture: Transformer

DCAgent/a1-curriculum_hard is an 8-billion-parameter language model fine-tuned from Qwen/Qwen3-8B. It was trained on the /e/scratch/jureap59/raoof1/sft_data/hf_hub/datasets--DCAgent--exp_rpt_curriculum-hard_10k_glm_4.7_traces_jupiter/snapshots/f1b42fbba3fc2cc7e0bf2b4ad33938849ed47fba_thinking_preprocessed dataset, which suggests a specialization in curriculum learning or complex reasoning tasks. Its 32,768-token context length allows it to process the extensive inputs typical of its fine-tuning domain.
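As a minimal sketch of how the model might be queried, assuming it is served behind an OpenAI-compatible chat-completions endpoint (this card does not specify the serving API, so the payload shape below is an assumption; only the model name and the 32,768-token context length come from the card):

```python
# Sketch: building a chat-completions request for DCAgent/a1-curriculum_hard.
# The payload format assumes an OpenAI-compatible server (hypothetical);
# MODEL and CTX_LENGTH are taken from this model card.
MODEL = "DCAgent/a1-curriculum_hard"
CTX_LENGTH = 32768  # maximum tokens (prompt + completion) per the card

def build_request(prompt: str, max_new_tokens: int = 1024) -> dict:
    """Build a request payload, keeping the reply budget inside the context window."""
    if not 0 < max_new_tokens < CTX_LENGTH:
        raise ValueError("max_new_tokens must leave room for the prompt")
    return {
        "model": MODEL,
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": max_new_tokens,
    }

payload = build_request("Design a curriculum for teaching recursion.")
```

The payload would then be POSTed to the server's chat-completions route; the guard on `max_new_tokens` simply keeps the requested completion within the 32k context budget.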
