INSAIT-Institute/BgGPT-Gemma-3-27B-IT

Vision: yes · Concurrency cost: 2 · Model size: 27B · Quant: FP8 · Context length: 32k · Published: Mar 25, 2026 · License: Gemma · Architecture: Transformer

BgGPT-Gemma-3-27B-IT is a 27 billion parameter instruction-tuned large language model developed by INSAIT, based on the Gemma 3 architecture. The model offers vision-language understanding, an effective context length of 131k tokens, and a knowledge cut-off updated to October 2025. It is specifically adapted for Bulgarian language tasks and excels at instruction-following and multi-turn conversations.


BgGPT-Gemma-3-27B-IT Overview

BgGPT-Gemma-3-27B-IT is a 27 billion parameter instruction-tuned model from INSAIT's BgGPT 3.0 series, built upon the Gemma 3 architecture. The model is specifically adapted for the Bulgarian language, and the series is available in several sizes, including 4B, 12B, and this 27B variant.

Key Capabilities & Improvements

  • Vision-Language Understanding: Processes both text and images within the same context, enabling multimodal interactions.
  • Enhanced Instruction-Following: Demonstrates improved performance on a wider array of tasks, including complex instructions, multi-turn conversations, and system prompts.
  • Extended Context Length: Features an effective context window of 131k tokens, facilitating longer and more intricate interactions.
  • Updated Knowledge Base: Incorporates pretraining data up to May 2025 and instruction fine-tuning data up to October 2025, ensuring a current knowledge cut-off.
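Since the model is typically served behind an OpenAI-compatible endpoint (as vLLM provides), the multimodal chat capability above can be sketched as a request payload combining Bulgarian text with an image reference. This is a minimal, stdlib-only sketch; the image URL and prompts are placeholders, and the exact content schema assumed here is the OpenAI-style `image_url`/`text` format, which a given deployment may or may not accept verbatim.

```python
import json

MODEL_ID = "INSAIT-Institute/BgGPT-Gemma-3-27B-IT"

def build_chat_payload(user_text, image_url=None, system_prompt=None,
                       max_tokens=512, temperature=0.2):
    """Assemble an OpenAI-compatible /v1/chat/completions payload.

    Text-only if image_url is None; otherwise the user turn carries both
    an image reference and text in the same context (vision-language use).
    """
    content = [{"type": "text", "text": user_text}]
    if image_url:
        # Image parts precede the text part in the user turn.
        content.insert(0, {"type": "image_url", "image_url": {"url": image_url}})
    messages = []
    if system_prompt:
        # The model is tuned to follow system prompts.
        messages.append({"role": "system", "content": system_prompt})
    messages.append({"role": "user", "content": content})
    return {
        "model": MODEL_ID,
        "messages": messages,
        "max_tokens": max_tokens,
        "temperature": temperature,
    }

# Hypothetical example: ask, in Bulgarian, what an image shows.
payload = build_chat_payload(
    "Опиши какво има на изображението.",  # "Describe what is in the image."
    image_url="https://example.com/photo.jpg",
    system_prompt="Ти си полезен асистент, който отговаря на български.",
)
print(json.dumps(payload, ensure_ascii=False, indent=2))
```

The same builder covers multi-turn use by appending further `user`/`assistant` entries to `messages` before sending.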

Usage Considerations

This model supports integration with the Hugging Face Transformers library and vLLM for efficient inference. It also supports dynamic FP8 quantization with vLLM, providing roughly 2x memory reduction with minimal quality loss on GPUs with compute capability >= 8.9 (e.g., H100, RTX 4090).
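The "roughly 2x memory reduction" figure can be sanity-checked with back-of-envelope arithmetic: FP8 stores one byte per weight versus two bytes for BF16. This sketch covers weights only; KV cache and activations add further memory, and the exact parameter count (assumed here as a round 27 billion) differs slightly from the real model.

```python
# Assumption: a round 27e9 parameters; the true count differs slightly.
PARAMS = 27e9

def weight_memory_gb(num_params, bytes_per_param):
    """Weight-only memory footprint in (decimal) gigabytes."""
    return num_params * bytes_per_param / 1e9

bf16_gb = weight_memory_gb(PARAMS, 2)  # BF16: 2 bytes per weight
fp8_gb = weight_memory_gb(PARAMS, 1)   # FP8:  1 byte per weight

print(f"BF16 weights: {bf16_gb:.0f} GB, FP8 weights: {fp8_gb:.0f} GB "
      f"({bf16_gb / fp8_gb:.1f}x reduction)")
```

So the weights alone drop from about 54 GB to about 27 GB, which is where the approximately 2x figure comes from; total serving memory shrinks somewhat less once the KV cache is included.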