akshayballal/Qwen2.5-1.5B-Instruct-SFT-MedQA-merged
Text Generation · Concurrency Cost: 1 · Model Size: 1.5B · Quant: BF16 · Ctx Length: 32k · Published: Feb 8, 2026 · License: apache-2.0 · Architecture: Transformer · Open Weights
The akshayballal/Qwen2.5-1.5B-Instruct-SFT-MedQA-merged model is a 1.5-billion-parameter instruction-tuned Qwen2.5 variant developed by akshayballal. As the name indicates, it is a supervised fine-tuned (SFT) checkpoint targeting MedQA-style medical question answering, with the adapter weights merged back into the base model. Fine-tuning was done with Unsloth and Hugging Face's TRL library, which accelerated training. The model retains the Qwen2.5 architecture and is optimized for instruction-following tasks.
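Since the checkpoint is merged, it can be loaded like any standard causal language model. The sketch below shows hypothetical usage with the Hugging Face `transformers` library; the model ID comes from this card, but the system prompt, sample question, and generation settings are illustrative assumptions, not documented defaults.

```python
MODEL_ID = "akshayballal/Qwen2.5-1.5B-Instruct-SFT-MedQA-merged"


def build_messages(question: str) -> list[dict]:
    """Wrap a medical question in the chat format Qwen2.5-Instruct expects."""
    return [
        # Assumed system prompt; the card does not specify one.
        {"role": "system", "content": "You are a helpful medical assistant."},
        {"role": "user", "content": question},
    ]


def main() -> None:
    # Imports kept local so the module loads without torch/transformers installed.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    # bfloat16 matches the BF16 quantization listed on the card.
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, torch_dtype=torch.bfloat16)

    input_ids = tokenizer.apply_chat_template(
        build_messages("Which organ produces insulin?"),
        add_generation_prompt=True,
        return_tensors="pt",
    )
    output_ids = model.generate(input_ids, max_new_tokens=256)
    # Decode only the newly generated tokens, skipping the prompt.
    print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))


if __name__ == "__main__":
    main()
```

Because the LoRA/SFT weights are already merged, no PEFT adapter loading step is needed; the checkpoint behaves as a drop-in replacement for the base Qwen2.5-1.5B-Instruct model.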