Vikhrmodels/Vikhr-YandexGPT-5-Lite-8B-it
Text generation · Model size: 8B · Quant: FP8 · Context length: 8k · Published: Feb 28, 2025 · License: yandexgpt-5-lite-8b-pretrain · Architecture: Transformer

Vikhrmodels/Vikhr-YandexGPT-5-Lite-8B-it is an 8-billion-parameter instruction-tuned language model based on YandexGPT-5-Lite-8B-pretrain, developed by Vikhrmodels. It specializes in Russian-language tasks, was fine-tuned with Supervised Fine-Tuning (SFT) on the GrandMaster-PRO-MAX and Grounded-RAG-RU-v2 datasets, and supports bilingual RU/EN interaction. The model performs well at instruction following and RAG-grounded responses, particularly for Russian content.
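As an instruction-tuned chat model, it can be driven through the standard Hugging Face `transformers` chat-template API. The sketch below is a minimal, hedged example of how one might load the model and run a single-turn generation; the `build_messages` and `generate` helpers are illustrative names, not part of the model's official documentation, and loading the full 8B weights requires a GPU with sufficient memory.

```python
MODEL_ID = "Vikhrmodels/Vikhr-YandexGPT-5-Lite-8B-it"


def build_messages(user_text: str) -> list[dict]:
    # Single-turn chat message list in the format expected by
    # tokenizer.apply_chat_template (role/content dicts).
    return [{"role": "user", "content": user_text}]


def generate(user_text: str, max_new_tokens: int = 256) -> str:
    # transformers is imported lazily so the lightweight helpers above
    # remain usable without the library (and without downloading weights).
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto")

    # Render the chat messages into model input ids using the model's
    # built-in chat template, appending the assistant generation prompt.
    input_ids = tokenizer.apply_chat_template(
        build_messages(user_text),
        add_generation_prompt=True,
        return_tensors="pt",
    ).to(model.device)

    output_ids = model.generate(input_ids, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, dropping the prompt.
    return tokenizer.decode(
        output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True
    )
```

Since the model is bilingual RU/EN, `user_text` can be supplied in either language, e.g. `generate("Кратко объясни, что такое RAG.")`.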
