aimamba/latvian-english-qwen2.5-1.5b

Task: Text Generation

  • Model size: 1.5B parameters
  • Quantization: BF16
  • Context length: 32k
  • Published: Apr 16, 2026
  • License: apache-2.0
  • Architecture: Transformer (open weights)

aimamba/latvian-english-qwen2.5-1.5b is a 1.5 billion parameter model developed by aimamba, fine-tuned from Qwen2.5 for Latvian and English language tasks. It was trained with Unsloth and Hugging Face's TRL library for accelerated fine-tuning, and is intended for applications that need efficient text understanding and generation in both Latvian and English.


Model Overview

aimamba/latvian-english-qwen2.5-1.5b is a specialized language model built on the Qwen2.5 architecture. With 1.5 billion parameters, it has been fine-tuned to handle tasks in both Latvian and English. Training used Unsloth, which the authors report gave a 2x speedup, together with Hugging Face's TRL library.

Key Capabilities

  • Bilingual Proficiency: Optimized for understanding and generating text in both Latvian and English.
  • Efficient Training: Benefits from Unsloth's accelerated training methods, making it a resource-efficient option.
  • Qwen2.5 Architecture: Leverages the robust capabilities of the Qwen2.5 base model.
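If the model is published on the Hugging Face Hub under this id, it can be loaded with the standard `transformers` API. The sketch below is a minimal, hedged example: it assumes the repository ships a Qwen2.5-style chat template and BF16 weights; the `generate` helper and the sample Latvian prompt are illustrative, not from the model card.

```python
# Minimal usage sketch. Assumptions: the model id resolves on the Hugging Face
# Hub and the tokenizer ships the usual Qwen2.5 chat template.
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "aimamba/latvian-english-qwen2.5-1.5b"

def generate(prompt: str, max_new_tokens: int = 128) -> str:
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, torch_dtype="bfloat16")
    # Wrap the prompt as a single user turn and apply the chat template.
    messages = [{"role": "user", "content": prompt}]
    inputs = tokenizer.apply_chat_template(
        messages, add_generation_prompt=True, return_tensors="pt"
    )
    outputs = model.generate(inputs, max_new_tokens=max_new_tokens)
    # Drop the prompt tokens; decode only the newly generated continuation.
    return tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True)

if __name__ == "__main__":
    # Illustrative bilingual prompt: "Translate to English: Good day, how are you?"
    print(generate("Tulko uz angļu valodu: Labdien, kā jums klājas?"))
```

Because generation is guarded by `if __name__ == "__main__":`, the module can be imported without triggering a model download.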

Good For

  • Applications requiring language processing in Latvian and English.
  • Developers seeking a compact yet capable bilingual model.
  • Use cases where training efficiency and performance are critical.