OiTe/MoR-M1-Qwen2.5-0.6a-0.4f
Text Generation · Concurrency cost: 1 · Model size: 0.5B · Quant: BF16 · Context length: 32k · Published: Dec 14, 2025 · License: apache-2.0 · Architecture: Transformer · Open weights

MoR-M1-Qwen2.5-0.6a-0.4f is a 387-million-parameter causal language model developed by OIT Technologies' Oit Lab, built on the Qwen2.5 transformer architecture. It is optimized for Moroccan Arabic (Darija) with a custom BPE tokenizer, enabling efficient tokenization and long-context modeling for this low-resource language. The compact model is intended for research, fine-tuning, and deployment in compute-constrained environments, targeting tasks that require a deep understanding of Darija.
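The model's actual tokenizer and vocabulary are not shown on this page, but the core BPE idea it relies on can be illustrated with a toy sketch: repeatedly merge the most frequent adjacent symbol pair in a training corpus, then apply those merges in order when encoding new text. The example corpus below uses hypothetical Latin-script Darija-like words purely for illustration; it is not the model's real training data or merge table.

```python
from collections import Counter

def bpe_train(corpus, num_merges):
    """Learn a list of BPE merges from a list of words."""
    # Represent each word as a tuple of single-character symbols.
    words = Counter(tuple(w) for w in corpus)
    merges = []
    for _ in range(num_merges):
        # Count every adjacent symbol pair, weighted by word frequency.
        pairs = Counter()
        for word, freq in words.items():
            for a, b in zip(word, word[1:]):
                pairs[(a, b)] += freq
        if not pairs:
            break
        best = max(pairs, key=pairs.get)  # most frequent pair
        merges.append(best)
        # Rewrite the vocabulary with the chosen pair merged into one symbol.
        new_words = Counter()
        for word, freq in words.items():
            merged, i = [], 0
            while i < len(word):
                if i + 1 < len(word) and (word[i], word[i + 1]) == best:
                    merged.append(word[i] + word[i + 1])
                    i += 2
                else:
                    merged.append(word[i])
                    i += 1
            new_words[tuple(merged)] += freq
        words = new_words
    return merges

def bpe_encode(word, merges):
    """Encode a word by applying the learned merges in training order."""
    symbols = list(word)
    for a, b in merges:
        i = 0
        while i < len(symbols) - 1:
            if symbols[i] == a and symbols[i + 1] == b:
                symbols[i:i + 2] = [a + b]
            else:
                i += 1
    return symbols

merges = bpe_train(["salam", "salam", "smiya"], 3)
print(bpe_encode("salam", merges))  # frequent word compresses to few tokens
print(bpe_encode("smiya", merges))  # rarer word stays at character level
```

A language-specific tokenizer of this kind keeps frequent Darija character sequences as single tokens, so common words cost fewer tokens and more text fits into the 32k context window.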
