AnatoliiPotapov/T-lite-instruct-0.1
Text generation · Model size: 8B · Quant: FP8 · Context length: 8K · Concurrency cost: 1 · Architecture: Transformer · Published: Jul 16, 2024

AnatoliiPotapov/T-lite-instruct-0.1 is an 8-billion-parameter instruction-tuned causal language model by AnatoliiPotapov, the instruct variant of T-lite-0.1. Trained in bf16, it draws on a diverse dataset that includes translated open-source English datasets and synthetic grounded QA contexts. The model performs strongly in multilingual settings, particularly Russian, outperforming several other 8B-class models on the Russian MT-Bench and Arena benchmarks.
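As a sketch of how the model might be used, the snippet below loads it with the standard Hugging Face transformers Auto* API. The repository ID comes from this card; the bf16 dtype matches the stated training precision, but the generation settings and the Russian example prompt are illustrative assumptions, not taken from the card.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "AnatoliiPotapov/T-lite-instruct-0.1"


def load_model():
    # bf16 matches the training precision stated on the card;
    # device_map="auto" places weights on available GPUs/CPU.
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID, torch_dtype=torch.bfloat16, device_map="auto"
    )
    return tokenizer, model


def chat(tokenizer, model, user_message, max_new_tokens=256):
    # Format a single-turn conversation with the tokenizer's chat
    # template, then decode only the newly generated tokens.
    messages = [{"role": "user", "content": user_message}]
    input_ids = tokenizer.apply_chat_template(
        messages, add_generation_prompt=True, return_tensors="pt"
    ).to(model.device)
    output = model.generate(input_ids, max_new_tokens=max_new_tokens)
    return tokenizer.decode(
        output[0][input_ids.shape[-1]:], skip_special_tokens=True
    )


# Example usage (the prompt asks, in Russian, for a short poem about spring):
# tok, mdl = load_model()
# print(chat(tok, mdl, "Напиши короткое стихотворение о весне."))
```

Loading the full 8B model requires a GPU with roughly 16 GB of memory in bf16; the FP8 quantization listed above would lower that further on supported hardware.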
