raalr/qwen2.5-1.5b-arabic-sft-1epoch
Text generation · Concurrency cost: 1 · Model size: 1.5B · Quantization: BF16 · Context length: 32k · Published: Apr 3, 2026 · Architecture: Transformer

raalr/qwen2.5-1.5b-arabic-sft-1epoch is a 1.5-billion-parameter language model fine-tuned for Arabic language tasks. It is based on the Qwen2.5 architecture and has undergone one epoch of supervised fine-tuning (SFT). With a context length of 32,768 tokens, it is suited to applications requiring robust Arabic language understanding and generation.
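As a sketch of how a prompt for this model might be structured, the snippet below builds a ChatML-style prompt by hand, assuming the fine-tune inherits the Qwen2.5 chat format (with `<|im_start|>` / `<|im_end|>` markers). In practice, the tokenizer's `apply_chat_template` method would normally handle this; the `build_prompt` helper here is illustrative only.

```python
# Sketch: rendering chat messages into a ChatML-style prompt string,
# assuming the model uses the Qwen2.5 chat format.

def build_prompt(messages):
    """Render a list of {"role", "content"} dicts into ChatML text."""
    parts = []
    for m in messages:
        parts.append(f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>\n")
    # Trailing assistant header cues the model to generate a reply.
    parts.append("<|im_start|>assistant\n")
    return "".join(parts)

messages = [
    {"role": "system", "content": "You are a helpful Arabic assistant."},
    {"role": "user", "content": "ما هي عاصمة مصر؟"},  # "What is the capital of Egypt?"
]
print(build_prompt(messages))
```

For actual inference, the model would typically be loaded through a library such as Hugging Face `transformers` (e.g. `AutoModelForCausalLM.from_pretrained("raalr/qwen2.5-1.5b-arabic-sft-1epoch")`), with the prompt produced by the tokenizer's own chat template rather than a hand-rolled string.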
