Vikhrmodels/QVikhr-2.5-1.5B-Instruct-r
Text generation · Model size: 1.5B · Quantization: BF16 · Context length: 32k · Published: Feb 11, 2025 · License: apache-2.0 · Architecture: Transformer · Open weights

QVikhr-2.5-1.5B-Instruct-r is a 1.5-billion-parameter instruction-tuned causal language model developed by Vikhrmodels, specialized for Russian-language tasks. It is based on the Qwen2.5-1.5B-Instruct architecture and was further trained on the RuMath dataset. The model supports bilingual (Russian/English) interaction and is optimized for mathematical reasoning in Russian.
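As a minimal usage sketch, the model can be queried like any chat-tuned causal LM via the Hugging Face `transformers` library. This assumes the weights are published on the Hub under the repository id shown in the title and that the model ships a standard chat template; the helper names (`build_messages`, `generate_answer`) and the sample Russian math prompt are illustrative, not part of the model card.

```python
# Hypothetical usage sketch: querying the model through transformers.
# Assumes the Hub repo id below and a standard chat template; helper
# names and the sample prompt are illustrative.
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "Vikhrmodels/QVikhr-2.5-1.5B-Instruct-r"

def build_messages(question: str) -> list:
    """Wrap a user question in the message format expected by apply_chat_template."""
    return [{"role": "user", "content": question}]

def generate_answer(question: str, max_new_tokens: int = 256) -> str:
    """Load the model, format the prompt with its chat template, and generate a reply."""
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, torch_dtype="bfloat16")
    prompt = tokenizer.apply_chat_template(
        build_messages(question), tokenize=False, add_generation_prompt=True
    )
    inputs = tokenizer(prompt, return_tensors="pt")
    output = model.generate(**inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, skipping the echoed prompt.
    return tokenizer.decode(
        output[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True
    )

if __name__ == "__main__":
    # Russian math prompt: "Solve the equation: 2x + 3 = 11"
    print(generate_answer("Реши уравнение: 2x + 3 = 11"))
```

Since the model is BF16 with a 32k context window, the full weights fit comfortably on a single consumer GPU; loading in `bfloat16` (as above) avoids an unnecessary FP32 upcast.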
