Defetya/gemma-2b-ru
Task: Text generation
Concurrency cost: 1
Model size: 2.6B
Quantization: BF16
Context length: 8k
License: apache-2.0
Architecture: Transformer (open weights)

Defetya/gemma-2b-ru is a 2.6-billion-parameter Gemma-based language model from Defetya, optimized for Russian-language fluency. It underwent a second stage of pre-training on 150 billion tokens drawn from the English and Russian portions of the OSCAR and Wikipedia corpora. It is a foundational model intended for further fine-tuning, with the goal of strengthening cross-lingual capability and serving as a strong open-source Russian LLM.
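As a sketch of how such a checkpoint is typically loaded, the following uses the Hugging Face `transformers` API with the BF16 weights listed above. This is an illustrative example, not an official usage snippet from the model author; it assumes `transformers` and `torch` are installed and that the `Defetya/gemma-2b-ru` checkpoint is reachable on the Hub.

```python
# Hypothetical usage sketch for Defetya/gemma-2b-ru via Hugging Face transformers.
# Assumes `torch` and `transformers` are installed; downloads the checkpoint on first run.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "Defetya/gemma-2b-ru"

def generate(prompt: str, max_new_tokens: int = 64) -> str:
    """Generate a continuation of `prompt` with greedy decoding."""
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID,
        torch_dtype=torch.bfloat16,  # matches the BF16 quantization listed on the card
    )
    inputs = tokenizer(prompt, return_tensors="pt")
    output = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(output[0], skip_special_tokens=True)

if __name__ == "__main__":
    # Russian prompt, since the model targets Russian fluency.
    print(generate("Москва — это"))
```

Since this is a base (foundational) model rather than an instruction-tuned one, plain text-continuation prompts like the one above will generally work better than chat-style instructions.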
