yibba/Atlas-Empathy-Darija
Text Generation · Concurrency Cost: 1 · Model Size: 9B · Quant: FP8 · Ctx Length: 16k · Published: Jan 3, 2026 · License: apache-2.0 · Architecture: Transformer

yibba/Atlas-Empathy-Darija is a 9-billion-parameter instruction-tuned causal language model, fine-tuned from MBZUAI-Paris/Atlas-Chat-9B. Developed by yibba, it was trained with Unsloth and Hugging Face's TRL library, a workflow reported to deliver roughly 2x faster training. The model targets applications that need a 9B-parameter model with a 16,384-token (16k) context length.
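For reference, a minimal inference sketch using the Hugging Face `transformers` library. This assumes the checkpoint is published on the Hub under the repo id `yibba/Atlas-Empathy-Darija`, that the tokenizer ships a chat template (Atlas-Chat is chat-tuned), and that `torch` and `accelerate` are installed; the prompt text is an illustrative Darija greeting, not taken from the model card.

```python
# Minimal usage sketch, not an official example from the model card.
# Assumes the weights are hosted on the Hugging Face Hub under this repo id.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "yibba/Atlas-Empathy-Darija"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

# The base model is chat-tuned, so apply the chat template before generating.
messages = [{"role": "user", "content": "السلام، كيف داير؟"}]  # "Hello, how are you?" in Darija
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

# Generate a response; the 16k context window leaves ample room for long prompts.
outputs = model.generate(inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```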
