mwitiderrick/SwahiliInstruct-v0.2
Task: Text generation | Model size: 7B | Quant: FP8 | Context length: 4k
Published: Jan 3, 2024 | License: apache-2.0 | Architecture: Transformer

mwitiderrick/SwahiliInstruct-v0.2 is a 7-billion-parameter instruction-tuned causal language model based on Mistral-7B-Instruct-v0.2. Developed by mwitiderrick, it is fine-tuned on the Swahili Alpaca dataset, adapting it to follow instructions and generate responses in Swahili. It has a 4096-token context length and targets Swahili language understanding and generation tasks.
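Because the model derives from Mistral-7B-Instruct-v0.2, prompts are typically wrapped in Mistral's `[INST] ... [/INST]` instruction tags before generation. A minimal sketch of that wrapping (the Swahili instruction is an illustrative example, and the tokenizer normally prepends the BOS token itself):

```python
def build_prompt(instruction: str) -> str:
    # Mistral-Instruct-style prompt: the instruction sits between
    # [INST] and [/INST]; the model generates its reply after the
    # closing tag. BOS (<s>) is usually added by the tokenizer.
    return f"[INST] {instruction} [/INST]"

prompt = build_prompt("Eleza umuhimu wa kilimo nchini Tanzania.")
print(prompt)
```

The resulting string can then be passed to any text-generation backend that serves the model checkpoint.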
