DCAgent/a1-synatra
Text generation · Concurrency cost: 1 · Model size: 8B · Quantization: FP8 · Context length: 32k · Published: Mar 27, 2026 · License: other · Architecture: Transformer

DCAgent/a1-synatra is an 8-billion-parameter causal language model fine-tuned from Qwen/Qwen3-8B. It is specifically optimized for processing and understanding traces from the neulab-synatra-sandboxes_glm_4.7_traces_jupiter dataset. Its 32,768-token context length makes it suitable for tasks that require extensive context within this specialized domain.
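As a sketch of how such a model would typically be used, the snippet below loads it with the Hugging Face `transformers` library and budgets the prompt against the 32,768-token window. This assumes the model is available under the same ID via the standard `AutoModelForCausalLM` API; the generation parameters are illustrative, not prescribed by the model card.

```python
MODEL_ID = "DCAgent/a1-synatra"  # 8B causal LM fine-tuned from Qwen/Qwen3-8B
MAX_CONTEXT = 32_768             # advertised context length in tokens


def prompt_budget(max_new_tokens: int, ctx: int = MAX_CONTEXT) -> int:
    """Tokens left for the prompt after reserving room for generation."""
    if max_new_tokens >= ctx:
        raise ValueError("max_new_tokens must be smaller than the context window")
    return ctx - max_new_tokens


def load(model_id: str = MODEL_ID):
    # Lazy import so the constants and budget helper above work
    # even where transformers/torch are not installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(
        model_id, torch_dtype="auto", device_map="auto"
    )
    return tokenizer, model


def generate(tokenizer, model, prompt: str, max_new_tokens: int = 256) -> str:
    # Truncate the prompt so prompt + generation stays within the window.
    inputs = tokenizer(
        prompt,
        return_tensors="pt",
        truncation=True,
        max_length=prompt_budget(max_new_tokens),
    ).to(model.device)
    output = model.generate(**inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, not the echoed prompt.
    new_tokens = output[0][inputs["input_ids"].shape[-1]:]
    return tokenizer.decode(new_tokens, skip_special_tokens=True)
```

Reserving generation headroom up front (rather than letting the prompt consume the full 32k window) avoids truncated or empty completions on long traces.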
