DCAgent/b1_top16_seq
Text Generation · Model Size: 8B · Quant: FP8 · Ctx Length: 32k · Concurrency Cost: 1 · Published: Apr 7, 2026 · License: other · Architecture: Transformer

DCAgent/b1_top16_seq is an 8-billion-parameter causal language model fine-tuned from Qwen/Qwen3-8B. It was trained on the dataset at /scratch/08134/negin/hub/datasets--DCAgent--b1_top16_seq, which suggests specialization in that dataset's domain. With a context length of 32768 tokens, it is suited to applications that process long input sequences.
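A minimal usage sketch with Hugging Face `transformers`, assuming the model is available under the repo id `DCAgent/b1_top16_seq` (the repo id, helper names, and the 1024-token generation reserve below are illustrative assumptions, not part of the card):

```python
CTX_LEN = 32768  # context length stated on the card


def budget_new_tokens(prompt_tokens: int, reserve: int = 1024) -> int:
    """Clamp the generation budget so prompt + output fit in the 32k window.

    `reserve` is an assumed default, not a documented parameter.
    """
    return max(0, min(reserve, CTX_LEN - prompt_tokens))


def generate(prompt: str, max_new_tokens: int = 512) -> str:
    """Load the model lazily and generate a completion for `prompt`."""
    # Imported here so the helpers above work without transformers installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "DCAgent/b1_top16_seq"  # assumed Hub repo id
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(
        model_id, torch_dtype="auto", device_map="auto"
    )
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    out = model.generate(**inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, not the echoed prompt.
    return tokenizer.decode(
        out[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True
    )
```

The FP8 quantization and concurrency settings listed above apply to the hosted deployment; loading locally as sketched here uses whatever dtype the checkpoint provides.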
