DCAgent/a1-stack_pytest
TEXT GENERATION
- Concurrency Cost: 1
- Model Size: 8B
- Quant: FP8
- Ctx Length: 32k
- Published: Mar 23, 2026
- License: other
- Architecture: Transformer
- Status: Cold

DCAgent/a1-stack_pytest is an 8 billion parameter language model fine-tuned from Qwen/Qwen3-8B. It was trained on the `exp_rpt_stack-pytest-large_10k_glm_4.7_traces_jupiter` dataset, which suggests it is optimized for specialized pytest-style testing or report-generation tasks rather than general-purpose language generation.
