DCAgent/a1-code_feedback
Text generation · Concurrency cost: 1 · Model size: 8B · Quant: FP8 · Context length: 32k · Published: Mar 25, 2026 · License: other · Architecture: Transformer

DCAgent/a1-code_feedback is an 8-billion-parameter language model fine-tuned from Qwen/Qwen3-8B. It was trained on the neulab-code-feedback-sandboxes_glm_4.7_traces_jupiter dataset, which suggests it is optimized for processing and generating feedback on code. Its 32768-token context length makes it suitable for long code snippets and the discussion around them.
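As a sketch of how the model might be used, the snippet below assembles a chat-completions-style request asking for feedback on a piece of code, with a rough character budget derived from the 32k context window. The endpoint shape, the `build_feedback_request` helper, and the 4-characters-per-token heuristic are illustrative assumptions, not part of this model card.

```python
MODEL_ID = "DCAgent/a1-code_feedback"
CTX_TOKENS = 32768     # context length stated on the model card
CHARS_PER_TOKEN = 4    # rough budgeting heuristic; not an exact tokenizer count

def build_feedback_request(code: str, question: str) -> dict:
    """Assemble a chat-completions payload asking for feedback on a snippet."""
    # Reserve roughly a quarter of the context for the model's reply,
    # and truncate the submitted code to fit the remaining budget.
    budget_chars = (CTX_TOKENS * 3 // 4) * CHARS_PER_TOKEN
    snippet = code[:budget_chars]
    return {
        "model": MODEL_ID,
        "messages": [
            {"role": "system",
             "content": "You review code and return concise, actionable feedback."},
            {"role": "user",
             "content": f"{question}\n\n```python\n{snippet}\n```"},
        ],
    }

req = build_feedback_request(
    "def add(a, b):\n    return a - b\n",
    "Is this implementation of add() correct?",
)
print(req["model"])  # DCAgent/a1-code_feedback
```

The payload would then be sent to whatever server hosts the model (e.g. an OpenAI-compatible inference endpoint); only the request construction is shown here.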
