RefalMachine/RuadaptQwen2.5-32B-Pro-Beta
Text generation · Model size: 32.8B · Quantization: FP8 · Context length: 32k · Published: Jan 19, 2025 · License: apache-2.0 · Architecture: Transformer · Open weights

RefalMachine/RuadaptQwen2.5-32B-Pro-Beta is a 32.8-billion-parameter language model from RefalMachine, adapted from T-pro-it-1.0 and optimized for Russian-language processing. The adaptation replaces the original tokenizer, continues pretraining on a Russian corpus, and then applies Learned Embedding Propagation (LEP). Because the adapted tokenizer represents Russian text in fewer tokens, Russian text generation is up to 60% faster than with the original model, making it well suited to Russian-centric applications.
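To see why a replaced tokenizer alone can speed up generation, consider that an autoregressive model's decoding cost scales with the number of tokens it must emit. The sketch below is a deliberately simplified illustration, not the actual tokenizers: the "baseline" splits Cyrillic text character by character (mimicking a vocabulary with poor Russian coverage), while the "adapted" side keeps whole words as single tokens.

```python
# Toy illustration of tokenizer-driven speedup (hypothetical tokenizations,
# not the real Qwen2.5 or Ruadapt vocabularies).

text = "Привет, мир"

# Baseline: every character becomes its own token (worst-case coverage).
baseline_tokens = list(text)

# Adapted: whole Cyrillic words are single tokens; punctuation is separate.
adapted_tokens = text.replace(",", " ,").split()

# Generation cost is roughly proportional to token count, so the ratio of
# token counts approximates the relative generation speedup.
ratio = len(baseline_tokens) / len(adapted_tokens)
print(len(baseline_tokens), len(adapted_tokens), round(ratio, 1))
# → 11 3 3.7
```

The real speedup (up to 60% for this model) is smaller than this toy ratio, since the original Qwen2.5 tokenizer already covers Russian partially, but the mechanism is the same: fewer tokens per text means fewer decoding steps.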
