ZigZeug/Baatukaay-Qwen2.5-3B-Wolof
Text generation · Concurrency cost: 1 · Model size: 3.1B · Quant: BF16 · Context length: 32k · Published: Mar 19, 2026 · License: apache-2.0 · Architecture: Transformer · Open weights

ZigZeug/Baatukaay-Qwen2.5-3B-Wolof is a 3.1-billion-parameter causal language model developed by ZigZeug, fine-tuned from unsloth/qwen2.5-3b-instruct-unsloth-bnb-4bit. It was trained using Unsloth and Hugging Face's TRL library, enabling 2x faster training. Built on the Qwen2.5 architecture with a 32,768-token context window, it targets tasks that call for a compact yet capable model.
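A minimal loading sketch using the Hugging Face transformers library, assuming the repository ships standard Qwen2.5-style chat-model files (tokenizer with a chat template, BF16 weights); the Wolof prompt is illustrative, not taken from the model card:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "ZigZeug/Baatukaay-Qwen2.5-3B-Wolof"

# Load the tokenizer and the BF16 weights; device_map="auto" places the
# model on a GPU when one is available.
tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForCausalLM.from_pretrained(
    MODEL_ID,
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

# Qwen2.5 instruct models use a chat template; this example prompt is
# hypothetical ("Nanga def?" is a common Wolof greeting).
messages = [{"role": "user", "content": "Nanga def?"}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=128)
# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```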