Skarmorie/Mag-Mell-RU-035
Text Generation · Concurrency Cost: 1 · Model Size: 12B · Quant: FP8 · Context Length: 32k · Published: Feb 3, 2025 · Architecture: Transformer
Skarmorie/Mag-Mell-RU-035 is a 12 billion parameter language model, merged using the Linear DARE method, with inflatebot/MN-12B-Mag-Mell-R1 as its base. This model integrates Aleteian/base-ground-2 to enhance its stability and output quality specifically for the Russian language. It is optimized for generating more consistent and reliable text in Russian, making it suitable for applications requiring robust multilingual capabilities.
Skarmorie/Mag-Mell-RU-035: Enhanced Russian Language Model
This model, Skarmorie/Mag-Mell-RU-035, is a 12 billion parameter language model created through a sophisticated merge process. It leverages the Linear DARE merge method, building upon the robust foundation of inflatebot/MN-12B-Mag-Mell-R1.
Key Capabilities
- Improved Russian Output: The primary differentiator of this model is its enhanced stability and quality for Russian language generation. This improvement stems from the integration of Aleteian/base-ground-2, which is itself a merge of Saiga and Vikhr models, known for their Russian-language proficiency.
- Merge Method: Utilizes Linear DARE, a technique designed for effective merging of pre-trained language models.
- Base Model: inflatebot/MN-12B-Mag-Mell-R1 serves as the foundational model, providing strong general language understanding.
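Merges like this are typically produced with mergekit. The following is a minimal sketch of what such a configuration might look like; the weight and density values are illustrative placeholders, since the actual merge recipe for this model is not published here.

```yaml
# Hypothetical mergekit configuration for a Linear DARE merge.
# Weight and density values are assumptions, not the recipe used for this model.
merge_method: dare_linear
base_model: inflatebot/MN-12B-Mag-Mell-R1
models:
  - model: Aleteian/base-ground-2
    parameters:
      weight: 0.35   # contribution of the Russian-focused model (illustrative)
      density: 0.5   # fraction of delta parameters DARE retains (illustrative)
dtype: bfloat16
```

DARE randomly drops a fraction of each contributing model's delta parameters (relative to the base) and rescales the rest before the linear combination, which tends to reduce interference between the merged models.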
Good For
- Applications requiring stable and high-quality text generation in Russian.
- Developers looking for a 12B parameter model with a specific focus on multilingual performance, particularly for Russian.
- Use cases where the combined strengths of Mag-Mell and Russian-optimized models are beneficial.