shibi76/kural-mistral-7b
TEXT GENERATION · Open Weights

- Model Size: 7B
- Quantization: FP8
- Context Length: 4k
- Concurrency Cost: 1
- Published: Apr 1, 2026
- License: apache-2.0
- Architecture: Transformer
shibi76/kural-mistral-7b is a 7-billion-parameter instruction-tuned causal language model published by shibi76. It was finetuned from unsloth/mistral-7b-instruct-v0.3-bnb-4bit using Unsloth together with Hugging Face's TRL library for accelerated finetuning. The model targets general language-generation tasks and inherits the efficiency of the Mistral architecture.
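Since the checkpoint is in the standard Hugging Face format, it can presumably be loaded with the `transformers` library. The sketch below is illustrative, not taken from this card: the generation settings are assumptions, and the weight download is deferred into a function so the prompt helper can be used standalone. Mistral instruct checkpoints expect the `[INST] ... [/INST]` chat format, which the helper reproduces.

```python
# Sketch: prompting shibi76/kural-mistral-7b with Hugging Face transformers.
# Generation parameters are illustrative assumptions, not settings
# documented on this model card.

def format_mistral_prompt(user_message: str) -> str:
    """Wrap a user message in the [INST] ... [/INST] chat format used by
    Mistral instruct checkpoints. The BOS token is left to the tokenizer."""
    return f"[INST] {user_message} [/INST]"


def generate(prompt: str, max_new_tokens: int = 128) -> str:
    """Load the model lazily and produce a completion (downloads weights)."""
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "shibi76/kural-mistral-7b"  # repo id from this card
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

    inputs = tokenizer(format_mistral_prompt(prompt), return_tensors="pt")
    inputs = inputs.to(model.device)
    output = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(output[0], skip_special_tokens=True)
```

For chat-style multi-turn use, `tokenizer.apply_chat_template` is the more robust route, since it reads the template shipped with the tokenizer instead of hard-coding the format.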