raalr/qwen2.5-1.5b-arabic-sft-3epoch
Text generation · Concurrency cost: 1 · Model size: 1.5B · Quant: BF16 · Ctx length: 32k · Published: Apr 3, 2026 · Architecture: Transformer

raalr/qwen2.5-1.5b-arabic-sft-3epoch is a 1.5-billion-parameter language model fine-tuned for Arabic language tasks. It is based on the Qwen2.5 architecture and underwent 3 epochs of supervised fine-tuning (SFT). Its compact size makes it suitable for efficient deployment in general Arabic natural language processing applications.
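A minimal usage sketch, assuming the model follows the standard Hugging Face `transformers` chat interface inherited from Qwen2.5. The system prompt and the Arabic example prompt are illustrative, not taken from the model card:

```python
MODEL_ID = "raalr/qwen2.5-1.5b-arabic-sft-3epoch"

def build_messages(user_prompt: str) -> list:
    """Compose a chat-style message list for the fine-tuned model."""
    return [
        # "You are a helpful assistant." (illustrative system prompt)
        {"role": "system", "content": "أنت مساعد مفيد."},
        {"role": "user", "content": user_prompt},
    ]

def generate(user_prompt: str, max_new_tokens: int = 256) -> str:
    """Download the model weights (several GB) and run one generation."""
    # Deferred import: only needed when actually running inference.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    # BF16 matches the quantization listed on the model card.
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, torch_dtype="bfloat16")
    text = tokenizer.apply_chat_template(
        build_messages(user_prompt), tokenize=False, add_generation_prompt=True
    )
    inputs = tokenizer(text, return_tensors="pt")
    output = model.generate(**inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, skipping the prompt.
    return tokenizer.decode(
        output[0][inputs.input_ids.shape[1]:], skip_special_tokens=True
    )

if __name__ == "__main__":
    # "Write a short greeting sentence."
    print(generate("اكتب جملة ترحيبية قصيرة."))
```

With a 32k context window and only 1.5B parameters, the model fits comfortably on a single consumer GPU or, more slowly, on CPU.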
