DCAgent/a1-pr_mining
Text generation · Concurrency cost: 1 · Model size: 8B · Quantization: FP8 · Context length: 32k · Published: Mar 23, 2026 · License: other · Architecture: Transformer
DCAgent/a1-pr_mining is an 8-billion-parameter language model fine-tuned from Qwen/Qwen3-8B. It was trained on the /e/scratch/jureap59/raoof1/sft_data/hf_hub/datasets--DCAgent--exp_rpt_pr_10k_glm_4.7_traces_jupiter/snapshots/2593d31f68aa08a582776112374e20bf323269c1_thinking_preprocessed dataset, which suggests a specialization in processing or generating content related to experimental reports or traces. With a context length of 32,768 tokens, it is suited to tasks that require extensive contextual understanding.
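As a rough illustration of what a 32,768-token context window allows, the sketch below estimates whether a prompt fits in the model's context before sending it. The ~4-characters-per-token heuristic is an assumption for illustration only; exact counts require the model's actual (Qwen3) tokenizer, and the `reserve_for_output` parameter is a hypothetical name, not part of any API.

```python
# Estimate whether a prompt fits in the model's 32,768-token context window.
# Assumption: ~4 characters per token, a common rough heuristic for English
# text; real counts must come from the model's tokenizer.
CTX_LENGTH = 32_768

def fits_in_context(text: str, reserve_for_output: int = 1024) -> bool:
    """Return True if the estimated prompt tokens plus a reserved
    output budget fit within the 32k context window."""
    est_tokens = len(text) // 4  # crude character-based estimate
    return est_tokens + reserve_for_output <= CTX_LENGTH

# A short prompt fits; a ~200k-character document does not.
print(fits_in_context("Summarize this experiment report."))
print(fits_in_context("x" * 200_000))
```

In practice one would replace the heuristic with a call to the model's tokenizer and compare `len(tokenizer(text).input_ids)` against the window.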