Zheng-Zong/AronaR1-DS-7B-v2-epoch_2
Text generation · Concurrency cost: 1 · Model size: 7.6B · Quantization: FP8 · Context length: 32k · Published: Mar 24, 2026 · License: apache-2.0 · Architecture: Transformer · Open weights
AronaR1-DS-7B-v2-epoch_2 is a 7.6-billion-parameter Qwen2-based causal language model developed by Zheng-Zong. It was fine-tuned from unsloth/DeepSeek-R1-Distill-Qwen-7B using Unsloth and Hugging Face's TRL library to accelerate training. The model supports a context length of 32,768 tokens and is intended for general-purpose text generation.
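A minimal inference sketch with the Hugging Face transformers library is shown below. It assumes the weights are published on the Hugging Face Hub under the repository ID listed above and that the checkpoint ships the chat template of its DeepSeek-R1-Distill-Qwen-7B base; neither is confirmed by this card.

```python
# Minimal inference sketch (assumed Hub repo ID and chat template; not confirmed by the card).
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Zheng-Zong/AronaR1-DS-7B-v2-epoch_2"  # assumed repository ID

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",   # pick the dtype stored in the checkpoint
    device_map="auto",    # requires accelerate; places layers on available devices
)

# Build a chat-style prompt via the tokenizer's chat template.
messages = [{"role": "user", "content": "Explain what a causal language model is."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

# Generate and decode only the newly produced tokens.
outputs = model.generate(inputs, max_new_tokens=512)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```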