DCAgent/b1_top4
Text generation · Concurrency cost: 1 · Model size: 8B · Quantization: FP8 · Context length: 32k · Published: Apr 6, 2026 · License: other · Architecture: Transformer

DCAgent/b1_top4 is an 8-billion-parameter language model fine-tuned from Qwen/Qwen3-8B. It was trained on the /e/scratch/jureap59/raoof1/sft_data/hf_hub/datasets--DCAgent--b1_top4/snapshots/db1eb508ebd868241a13f4083e7939710048d63c_thinking_preprocessed dataset, supports a 32,768-token context length, and is optimized for tasks related to its fine-tuning data.
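Since the model derives from Qwen/Qwen3-8B, it can presumably be loaded with the standard Hugging Face transformers chat workflow. The sketch below is an assumption based on the card's metadata, not an official usage snippet: the repo id `DCAgent/b1_top4`, the use of the tokenizer's chat template, and the generation settings are all illustrative.

```python
def build_messages(prompt: str) -> list[dict]:
    """Wrap a user prompt in the chat-message format used by Qwen3-style templates."""
    return [{"role": "user", "content": prompt}]


def generate(prompt: str, max_new_tokens: int = 256) -> str:
    """Hedged sketch: load the model and generate a reply.

    Assumes the model is published on the Hugging Face Hub under the
    card's name; adjust MODEL_ID if it is hosted elsewhere.
    """
    # Lazy import so the helper above stays usable without transformers installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "DCAgent/b1_top4"  # assumed Hub repo id matching the card title
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(
        model_id, torch_dtype="auto", device_map="auto"
    )

    # Apply the chat template inherited from the Qwen3 base model.
    input_ids = tokenizer.apply_chat_template(
        build_messages(prompt), add_generation_prompt=True, return_tensors="pt"
    ).to(model.device)

    output = model.generate(input_ids, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, not the echoed prompt.
    return tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True)
```

Note that the FP8 quantization listed above refers to the served artifact; loading locally with `torch_dtype="auto"` will use whatever precision the checkpoint ships in.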
