yibba/GemMaroc-Qwen2.5-7B-Instruct-Therapy-Darija
Task: Text Generation
Model Size: 7.6B parameters
Quantization: FP8
Context Length: 32k
Architecture: Transformer
Concurrency Cost: 1
Published: Jan 5, 2026
License: apache-2.0
Open Weights
yibba/GemMaroc-Qwen2.5-7B-Instruct-Therapy-Darija is a 7.6-billion-parameter instruction-tuned causal language model, fine-tuned by yibba from the GemMaroc/Qwen2.5-7B-Instruct-darija base model. It is optimized for therapeutic conversation in Darija (Moroccan Arabic) and was trained efficiently with Unsloth and Hugging Face's TRL library. Its 32,768-token context length makes it suitable for extended conversational interactions in Darija.
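The card does not include a usage snippet, so the following is a minimal sketch of how an instruction-tuned Qwen2.5 derivative like this one is typically loaded with the Hugging Face `transformers` library. The system prompt, generation parameters, and the assumption that the tokenizer ships a standard chat template are all illustrative, not taken from the card.

```python
def build_chat(user_message, system_prompt="You are a supportive assistant responding in Moroccan Darija."):
    """Build a chat-format message list for the instruction-tuned model.

    The system prompt here is a hypothetical example, not one published
    with the model.
    """
    return [
        {"role": "system", "content": system_prompt},
        {"role": "user", "content": user_message},
    ]


def generate_reply(user_message, max_new_tokens=256):
    """Generate a reply. Downloads the 7.6B-parameter weights on first call."""
    # Heavy imports kept local so build_chat() is usable without torch installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "yibba/GemMaroc-Qwen2.5-7B-Instruct-Therapy-Darija"
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(
        model_id, torch_dtype="auto", device_map="auto"
    )

    # Assumes the tokenizer provides a Qwen2.5-style chat template.
    inputs = tokenizer.apply_chat_template(
        build_chat(user_message),
        add_generation_prompt=True,
        return_tensors="pt",
    ).to(model.device)
    output = model.generate(inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, not the echoed prompt.
    return tokenizer.decode(output[0][inputs.shape[-1]:], skip_special_tokens=True)
```

Because the model is FP8-quantized with a 32k context, a single modern GPU with roughly 10-16 GB of memory should suffice for inference, though exact requirements depend on the serving stack.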