Cartik/BastiAI-1.1-Instruct

Text Generation

  • Model Size: 1.5B
  • Quantization: BF16
  • Context Length: 32k
  • Published: Apr 30, 2026
  • License: apache-2.0
  • Architecture: Transformer
  • Concurrency Cost: 1

Cartik/BastiAI-1.1-Instruct is a 1.5 billion parameter instruction-tuned causal language model developed by Cartik, based on Qwen/Qwen2.5-1.5B-Instruct. This model is specifically fine-tuned for both Russian and English languages, offering a 32768-token context length. It is designed for general instruction-following tasks in a bilingual context.


Overview

Cartik/BastiAI-1.1-Instruct is a compact yet capable instruction-tuned language model built on the Qwen2.5-1.5B-Instruct architecture. With 1.5 billion parameters and a 32768-token context window, it can process long inputs and generate coherent, extended responses while remaining inexpensive to serve.

Key Capabilities

  • Bilingual Support: Optimized for instruction-following tasks in both Russian (ru) and English (en).
  • Instruction Tuning: Designed to accurately interpret and execute user instructions, providing relevant and helpful outputs.
  • Extended Context Window: A 32768-token context length allows for handling complex queries, detailed conversations, and longer documents.
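Since the model is based on Qwen2.5-1.5B-Instruct, it should work with the standard Hugging Face `transformers` chat workflow. The sketch below is a minimal, hedged example (the helper names `build_chat` and `generate` are illustrative, not part of the model's API); it assumes the repository ships a Qwen2.5-style chat template.

```python
# Minimal sketch: querying Cartik/BastiAI-1.1-Instruct via transformers.
# Assumes transformers (and accelerate, for device_map="auto") are installed.
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "Cartik/BastiAI-1.1-Instruct"

def build_chat(user_prompt, system_prompt="You are a helpful bilingual assistant."):
    """Assemble a chat-format message list (Qwen2.5-style roles).

    Works the same for Russian or English prompts, since the model
    is tuned for both languages.
    """
    return [
        {"role": "system", "content": system_prompt},
        {"role": "user", "content": user_prompt},
    ]

def generate(prompt, max_new_tokens=256):
    """Load the model and generate a reply for a single prompt."""
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID,
        torch_dtype="auto",   # BF16 weights per the model card
        device_map="auto",
    )
    # Render the messages with the model's own chat template.
    text = tokenizer.apply_chat_template(
        build_chat(prompt), tokenize=False, add_generation_prompt=True
    )
    inputs = tokenizer(text, return_tensors="pt").to(model.device)
    out = model.generate(**inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, not the echoed prompt.
    return tokenizer.decode(
        out[0][inputs.input_ids.shape[1]:], skip_special_tokens=True
    )
```

The same `build_chat` helper can be reused for Russian prompts (e.g. `generate("Привет! Расскажи о себе.")`), since the system/user role structure is language-independent.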

Good For

  • Applications requiring efficient instruction-following in Russian and English.
  • Use cases where a smaller model size is preferred without significantly compromising on context handling.
  • Developing chatbots or virtual assistants that need to operate in a bilingual environment.