Zheng-Zong/AronaR1-DS-7B-v2-epoch_5
Text Generation
Concurrency Cost: 1 | Model Size: 7.6B | Quant: FP8 | Context Length: 32k
Published: Mar 24, 2026 | License: apache-2.0 | Architecture: Transformer | Open Weights | Cold

Zheng-Zong/AronaR1-DS-7B-v2-epoch_5 is a 7.6-billion-parameter Qwen2-based causal language model developed by Zheng-Zong, fine-tuned from unsloth/DeepSeek-R1-Distill-Qwen-7B. It was trained with Unsloth and Hugging Face's TRL library, which accelerate fine-tuning. The model targets general text-generation tasks and supports a 32,768-token context length for processing long inputs.
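A minimal usage sketch with the Hugging Face `transformers` library is shown below. This is not an official snippet from the model author: the chat-template usage and generation settings are assumptions based on the DeepSeek-R1-Distill-Qwen-7B base model, and running it requires downloading the weights and sufficient GPU/CPU memory.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Model repository on the Hugging Face Hub (from this card).
MODEL_ID = "Zheng-Zong/AronaR1-DS-7B-v2-epoch_5"


def generate(prompt: str, max_new_tokens: int = 512) -> str:
    """Generate a completion for a single-turn chat prompt.

    Loads the tokenizer and model lazily; dtype and device placement
    are left to transformers ("auto") since the card lists FP8 weights.
    """
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID, torch_dtype="auto", device_map="auto"
    )
    # Apply the chat template inherited from the base model (assumption).
    input_ids = tokenizer.apply_chat_template(
        [{"role": "user", "content": prompt}],
        add_generation_prompt=True,
        return_tensors="pt",
    ).to(model.device)
    output_ids = model.generate(input_ids, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, not the echoed prompt.
    return tokenizer.decode(
        output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True
    )


if __name__ == "__main__":
    print(generate("Briefly explain what a distilled language model is."))
```

Because the base model is an R1-style reasoning distill, outputs may include a visible chain-of-thought segment before the final answer; downstream code should be prepared to strip or display it as appropriate.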
