Vikhrmodels/it-5.3-fp16-32k
Task: Text generation
Model size: 8B
Quantization: FP8
Context length: 8k
Concurrency cost: 1
Published: Jun 4, 2024
License: apache-2.0
Architecture: Transformer
Open weights

Vikhrmodels/it-5.3-fp16-32k is an 8-billion-parameter instruction-tuned large language model developed by Aleksandr Nikolich, Konstantin Korolev, and Artem Shelmanov. The model features an extended context length of 32,000 tokens, enabled by RoPE, and is specifically optimized for stable JSON output and multi-turn conversations. It is designed to perform reliably on long-context inputs and complex prompts, particularly in Russian.
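The extended context window rests on rotary position embeddings (RoPE). As a minimal illustration of the mechanism (a generic sketch, not this model's actual configuration or scaling scheme), RoPE assigns each pair of head dimensions an inverse frequency derived from a base value, and position information enters as a rotation angle proportional to the token index:

```python
import math

def rope_inverse_frequencies(head_dim: int, base: float = 10000.0):
    """Inverse frequencies for RoPE: theta_i = base^(-2i/d) per dimension pair."""
    return [base ** (-2.0 * i / head_dim) for i in range(head_dim // 2)]

def rope_angles(position: int, inv_freqs):
    """Rotation angle applied to each dimension pair at a given token position."""
    return [position * f for f in inv_freqs]

# Example: a 128-dimensional attention head yields 64 frequency pairs;
# the first pair rotates fastest (inverse frequency 1.0).
freqs = rope_inverse_frequencies(128)
angles = rope_angles(1000, freqs)
```

Because positions enter only through these rotation angles, raising the base (or rescaling positions) stretches the usable position range, which is how RoPE-based models are commonly extended from a shorter training context to longer windows such as 32k.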
