Shahansha/Manthan-1.5B-sft
Text generation · Concurrency cost: 1 · Model size: 1.5B · Quant: BF16 · Context length: 32k · Published: Apr 9, 2026 · License: apache-2.0 · Architecture: Transformer · Open weights
Shahansha/Manthan-1.5B-sft is a 1.5-billion-parameter instruction-tuned causal language model by Shahansha, finetuned from unsloth/Qwen2.5-1.5B-Instruct-bnb-4bit. It was trained with Unsloth and Hugging Face's TRL library, which speeds up finetuning, and is intended for general language understanding and generation tasks.
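A minimal sketch of loading and prompting the model with the Hugging Face `transformers` library, assuming the checkpoint is published on the Hub under the id above and follows the Qwen2.5 chat template (both are assumptions, not verified here):

```python
# Sketch: load Shahansha/Manthan-1.5B-sft with transformers and
# generate a reply. Assumes `pip install transformers torch` and
# that the checkpoint is available on the Hugging Face Hub.
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "Shahansha/Manthan-1.5B-sft"  # assumed Hub id

def generate(prompt: str, max_new_tokens: int = 128) -> str:
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    # BF16 matches the quant listed in the model metadata.
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, torch_dtype="bfloat16")
    # Qwen2.5-style instruct models expect the chat template.
    messages = [{"role": "user", "content": prompt}]
    inputs = tokenizer.apply_chat_template(
        messages, add_generation_prompt=True, return_tensors="pt"
    )
    outputs = model.generate(inputs, max_new_tokens=max_new_tokens)
    # Strip the prompt tokens, keep only the newly generated text.
    return tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True)

if __name__ == "__main__":
    print(generate("Summarize the benefits of instruction tuning."))
```

The download happens lazily inside `generate`, so importing the module itself is cheap; for repeated calls you would hoist the tokenizer and model out of the function.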