DCAgent/a1-manybugs
Text generation · Concurrency cost: 1 · Model size: 8B · Quant: FP8 · Context length: 32k · Published: Mar 23, 2026 · License: other · Architecture: Transformer

DCAgent/a1-manybugs is an 8-billion-parameter language model fine-tuned from Qwen/Qwen3-8B. It was trained on the 'exp_rpt_manybugs-v2_10k_glm_4.7_traces_jupiter_thinking_preprocessed' dataset, whose name suggests an optimization for bug reporting, analysis, and debugging workflows. With a context length of 32768 tokens, it is designed to handle extensive inputs such as long traces and reports.
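As a rough illustration of working within the 32768-token window, the sketch below trims an oversized bug trace to a token budget before prompting the model. The 4-characters-per-token ratio and the `truncate_to_budget` helper are illustrative assumptions, not part of this model or its tokenizer; real budgets should be measured with the model's own tokenizer.

```python
# Sketch: keep a long debugging trace within the model's 32k-token context.
# Assumes a rough heuristic of ~4 characters per token (not tokenizer-exact).

CTX_TOKENS = 32768          # context length reported for DCAgent/a1-manybugs
CHARS_PER_TOKEN = 4         # heuristic assumption, not a model property
RESERVED_TOKENS = 1024      # leave headroom for the generated answer

def truncate_to_budget(text: str,
                       ctx_tokens: int = CTX_TOKENS,
                       reserved: int = RESERVED_TOKENS) -> str:
    """Trim `text` so prompt plus reply plausibly fit the context window."""
    budget_chars = (ctx_tokens - reserved) * CHARS_PER_TOKEN
    if len(text) <= budget_chars:
        return text
    # Keep the tail of the trace: the most recent frames usually matter most.
    return text[-budget_chars:]

trace = "stack frame\n" * 50_000          # an oversized synthetic trace
prompt = truncate_to_budget(trace)
print(len(prompt) <= (CTX_TOKENS - RESERVED_TOKENS) * CHARS_PER_TOKEN)  # True
```

Keeping the tail rather than the head is a design choice: in most stack traces the innermost, most recent frames carry the diagnostic signal.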
