arcee-ai/Meraj-Mini
TEXT GENERATION
Concurrency Cost: 1 · Model Size: 7.6B · Quant: FP8 · Ctx Length: 32k · Published: Oct 6, 2024 · License: apache-2.0 · Architecture: Transformer · Open Weights

Arcee Meraj Mini is an open-weights, 7.6-billion-parameter instruction-tuned causal language model developed by arcee-ai, fine-tuned from Qwen2.5-7B-Instruct. It is designed for strong performance in both Arabic and English, excelling at Arabic language understanding and cultural adaptation while maintaining competitive English capabilities. The model is optimized for bilingual tasks, including content creation, customer service, education, mathematics, and coding in Arabic contexts.
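Since the model is instruction-tuned and published with open weights, it can be loaded with the Hugging Face `transformers` library like any other chat model. The sketch below is a minimal, hedged example: it assumes `transformers` and a PyTorch backend are installed, and that the model's tokenizer ships a chat template (standard for Qwen2.5-derived models); the Arabic prompt is an illustrative example, not from the model card.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Model ID as listed on the card above.
MODEL_ID = "arcee-ai/Meraj-Mini"


def chat(prompt: str, max_new_tokens: int = 256) -> str:
    """Run a single-turn chat completion against Meraj-Mini.

    Assumes a chat template is bundled with the tokenizer, as is
    standard for Qwen2.5-based instruction-tuned models.
    """
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID, torch_dtype="auto", device_map="auto"
    )
    messages = [{"role": "user", "content": prompt}]
    input_ids = tokenizer.apply_chat_template(
        messages, add_generation_prompt=True, return_tensors="pt"
    ).to(model.device)
    output_ids = model.generate(input_ids, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, not the prompt.
    return tokenizer.decode(
        output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True
    )


if __name__ == "__main__":
    # Example bilingual usage: "Briefly explain machine learning." in Arabic.
    print(chat("اشرح مفهوم التعلم الآلي بإيجاز."))
```

Because the card lists a 32k context length, long bilingual documents can be passed directly in the prompt; for FP8-quantized serving as listed above, a dedicated inference server would typically be used instead of this local sketch.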
