akshayballal/Qwen2.5-1.5B-Instruct-SFT-Pubmed-16bit-DFT
Text Generation · Concurrency Cost: 1 · Model Size: 1.5B · Quant: BF16 · Ctx Length: 32k · Published: Jan 10, 2026 · License: apache-2.0 · Architecture: Transformer · Open Weights · Warm

akshayballal/Qwen2.5-1.5B-Instruct-SFT-Pubmed-16bit-DFT is a 1.5-billion-parameter instruction-tuned language model based on Qwen2.5, developed by akshayballal. It was fine-tuned with Unsloth and Hugging Face's TRL library, enabling faster training, and is optimized for the PubMed dataset, making it well suited to biomedical and scientific text processing. The model supports a context length of 32,768 tokens, consistent with the 32k limit listed above.
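A minimal sketch of querying the model with Hugging Face transformers. The model ID comes from this card; the prompt and generation settings are illustrative, not prescribed by the card.

```python
# Minimal usage sketch with Hugging Face transformers.
# The model ID is from this card; the prompt and settings are examples only.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "akshayballal/Qwen2.5-1.5B-Instruct-SFT-Pubmed-16bit-DFT"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype="bfloat16")

# Qwen2.5-Instruct models ship a chat template; the tokenizer applies it.
messages = [
    {"role": "user",
     "content": "Summarize the role of ACE inhibitors in treating hypertension."}
]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
)

outputs = model.generate(inputs, max_new_tokens=256)
# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```

Since the weights are published in BF16, loading with `torch_dtype="bfloat16"` matches the checkpoint; omit it to fall back to the library default.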
