DCAgent/a1-qasper
Text generation
Concurrency cost: 1
Model size: 8B
Quantization: FP8
Context length: 32k
Published: Apr 1, 2026
License: other
Architecture: Transformer

DCAgent/a1-qasper is an 8-billion-parameter language model fine-tuned from Qwen/Qwen3-8B by DCAgent. It was trained on the qasper-sandboxes_glm_4.7_traces_jupiter dataset, which suggests it is optimized for question answering over scientific papers and similarly structured documents. Its 32768-token context length makes it suitable for processing long documents and complex queries.
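Assuming the model is served behind an OpenAI-compatible chat-completions endpoint (the endpoint URL and API key below are placeholders, not values documented by DCAgent), a request for question answering over a paper might be sketched like this:

```python
import json
import urllib.request

API_URL = "https://example.com/v1/chat/completions"  # placeholder endpoint, not documented
API_KEY = "YOUR_API_KEY"  # placeholder credential

def build_request(question: str, paper_text: str, max_tokens: int = 512) -> dict:
    """Build an OpenAI-style chat payload for paper question answering.

    The model's 32k context window leaves room for a full paper plus the question.
    """
    return {
        "model": "DCAgent/a1-qasper",
        "messages": [
            {"role": "system", "content": "Answer questions about the provided paper."},
            {"role": "user", "content": f"{paper_text}\n\nQuestion: {question}"},
        ],
        "max_tokens": max_tokens,
    }

def ask(question: str, paper_text: str) -> str:
    """Send the request and return the model's answer text."""
    payload = build_request(question, paper_text)
    req = urllib.request.Request(
        API_URL,
        data=json.dumps(payload).encode(),
        headers={
            "Authorization": f"Bearer {API_KEY}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["choices"][0]["message"]["content"]
```

Whether this exact request shape applies depends on how the model is hosted; the payload format above follows the common chat-completions convention rather than any interface stated on this page.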
