XXsongLALA/Qwen-2.5-7B-base-RAG-RL
Text Generation
- Concurrency Cost: 1
- Model Size: 7.6B
- Quant: FP8
- Ctx Length: 32k
- Published: Feb 28, 2025
- Architecture: Transformer

XXsongLALA/Qwen-2.5-7B-base-RAG-RL is a 7.6-billion-parameter base model from the Qwen 2.5 family with a 131,072-token maximum context length (served here with a 32k context window). The model card states it was trained from scratch, though no dataset details are provided. It is intended as a foundational language model, suitable for further fine-tuning or for applications that require a long context window.
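As with other Qwen 2.5 checkpoints, the model can presumably be loaded through the Hugging Face `transformers` library. The sketch below is an assumption based on the standard `AutoModelForCausalLM` API, not usage instructions from this model card; `device_map="auto"` additionally requires the `accelerate` package.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "XXsongLALA/Qwen-2.5-7B-base-RAG-RL"

def load_model(device_map: str = "auto", torch_dtype="auto"):
    """Download the checkpoint from the Hugging Face Hub and return
    (tokenizer, model). Loading a 7.6B model needs roughly 16 GB of
    memory in bf16/fp16, so this is deferred behind a function call."""
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID,
        device_map=device_map,   # requires `accelerate`
        torch_dtype=torch_dtype, # "auto" keeps the checkpoint's dtype
    )
    return tokenizer, model

if __name__ == "__main__":
    # Example (downloads ~15 GB of weights on first run):
    # tokenizer, model = load_model()
    # inputs = tokenizer("Retrieval-augmented generation is", return_tensors="pt")
    # print(tokenizer.decode(model.generate(**inputs, max_new_tokens=32)[0]))
    pass
```

Since this is a base (not instruction-tuned) model, prompts should be phrased as text to be continued rather than as chat messages.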
