Zheng-Zong/AronaR1-DS-7B-v3-epoch_4
Text Generation · Open Weights · Cold
Concurrency Cost: 1 · Model Size: 7.6B · Quant: FP8 · Ctx Length: 32k · Published: Mar 25, 2026 · License: apache-2.0 · Architecture: Transformer
Zheng-Zong/AronaR1-DS-7B-v3-epoch_4 is a 7.6-billion-parameter Qwen2-based causal language model, finetuned by Zheng-Zong from unsloth/DeepSeek-R1-Distill-Qwen-7B. It was trained with Unsloth and Hugging Face's TRL library, which enable faster finetuning. With a context length of 32,768 tokens, it is designed for general language generation tasks.
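As a sketch of how the model could be used locally, the snippet below loads it with the Hugging Face Transformers library. It assumes the checkpoint exposes the standard Qwen2 causal-LM interface and ships a chat template (typical for DeepSeek-R1 distills); the dtype and device settings are illustrative defaults, not requirements stated on this page.

```python
# Minimal sketch: loading the model with Hugging Face Transformers.
# Assumes standard Qwen2 causal-LM support; settings are illustrative.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Zheng-Zong/AronaR1-DS-7B-v3-epoch_4"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",   # use the checkpoint's native dtype
    device_map="auto",    # spread layers across available devices
)

# Use the chat template, since the base model is a DeepSeek-R1 distill
# trained on chat-formatted data (an assumption for this checkpoint).
messages = [{"role": "user", "content": "Explain causal language modeling."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```

With the 32,768-token context length, long prompts fit directly; only `max_new_tokens` needs adjusting for longer generations.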