msu-rcc-lair/RuadaptQwen2.5-32B-Instruct
Text generation · Model size: 32.8B · Quant: FP8 · Context length: 32k · Published: Nov 10, 2024 · License: apache-2.0 · Architecture: Transformer · Open weights

msu-rcc-lair/RuadaptQwen2.5-32B-Instruct is a 32.8 billion parameter instruction-tuned language model based on the Qwen2.5 architecture, adapted for Russian by msu-rcc-lair. The adaptation replaces the original tokenizer, continues pretraining on a Russian corpus, and applies Learned Embedding Propagation (LEP). The new tokenizer encodes Russian text in fewer tokens, which increases Russian text generation speed by up to 60% compared to the original Qwen2.5-32B-Instruct, making the model well suited to Russian-language tasks.
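A minimal usage sketch with the `transformers` library, assuming the checkpoint is hosted on the Hugging Face Hub under this repository id and exposes a Qwen2.5-style chat template; this is an illustration, not an official snippet from the model authors:

```python
# Hypothetical usage sketch: loads the checkpoint from the Hugging Face Hub
# and runs one Russian chat turn. Requires `transformers`, `torch`, and
# enough GPU memory for a 32.8B-parameter model.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "msu-rcc-lair/RuadaptQwen2.5-32B-Instruct"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",   # pick a dtype supported by the local hardware
    device_map="auto",    # shard the model across available GPUs
)

# Format the prompt with the model's bundled chat template.
messages = [{"role": "user", "content": "Кратко объясни, что такое LEP."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output = model.generate(inputs, max_new_tokens=256)
print(tokenizer.decode(output[0][inputs.shape[-1]:], skip_special_tokens=True))
```

Because the replaced tokenizer produces fewer tokens per Russian sentence than the stock Qwen2.5 tokenizer, each generated token covers more text, which is where the claimed speedup comes from.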
