raalr/qwen2.5-1.5b-arabic-sft-3epoch

Text generation · Model size: 1.5B · Quantization: BF16 · Context length: 32k · Published: Apr 3, 2026 · Architecture: Transformer

raalr/qwen2.5-1.5b-arabic-sft-3epoch is a 1.5-billion-parameter language model fine-tuned for Arabic language tasks. It is based on the Qwen2.5 architecture and has undergone three epochs of supervised fine-tuning (SFT). The model targets general Arabic natural language processing applications, and its compact size makes it efficient to deploy.


Model Overview

raalr/qwen2.5-1.5b-arabic-sft-3epoch is a 1.5-billion-parameter language model built on the Qwen2.5 architecture. It has been fine-tuned specifically for Arabic language understanding and generation, over three epochs of supervised fine-tuning (SFT).

Key Characteristics

  • Architecture: Qwen2.5 base model.
  • Parameter Count: 1.5 billion parameters, offering a balance between performance and computational efficiency.
  • Language Focus: Primarily designed and fine-tuned for Arabic language tasks.
  • Training: Supervised fine-tuning (SFT) over 3 epochs to enhance its capabilities in Arabic contexts.
  • Context Length: Supports a context length of 32,768 tokens (see the loading sketch after this list).
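The sketch below shows how these characteristics translate into a basic loading setup. It is a minimal example, assuming the checkpoint is published on the Hugging Face Hub under the repository id shown on this page and ships in the standard Qwen2.5 / transformers format; adjust the id or dtype if the repository states otherwise.

```python
# Minimal loading sketch (assumes the repo id on this page and standard
# Qwen2.5 / transformers format).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "raalr/qwen2.5-1.5b-arabic-sft-3epoch"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the BF16 weights listed above
    device_map="auto",           # requires `accelerate`; drop for CPU-only use
)

# Qwen2.5 configs expose the 32k context window noted above.
print(model.config.max_position_embeddings)  # expected: 32768
```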

Intended Use Cases

This model is suitable for a variety of Arabic NLP applications where a smaller, efficient model is preferred. Potential use cases include:

  • Arabic text generation (see the usage sketch following this list).
  • Arabic language understanding tasks.
  • Integration into applications requiring Arabic language processing with moderate resource requirements.
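For text generation, a hedged usage sketch follows. It assumes the SFT checkpoint retains the chat template shipped with the Qwen2.5 base tokenizer; if the repository documents its own prompt format, use that instead. The Arabic prompt is an illustrative example, not taken from the model's training data.

```python
# Generation sketch: assumes the tokenizer still carries the Qwen2.5 chat template.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "raalr/qwen2.5-1.5b-arabic-sft-3epoch"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.bfloat16, device_map="auto"
)

# Arabic prompt: "Write a short paragraph on the importance of reading."
messages = [{"role": "user", "content": "اكتب فقرة قصيرة عن أهمية القراءة."}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output_ids = model.generate(
    input_ids, max_new_tokens=256, do_sample=True, temperature=0.7
)
# Decode only the newly generated tokens.
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```

Sampling parameters such as `temperature` and `max_new_tokens` are placeholders; tune them for the target application.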