ali-elganzory/Qwen2.5-1.5B-SFT-Tulu3-decontaminated
Text generation · Concurrency cost: 1 · Model size: 1.5B · Quant: BF16 · Ctx length: 32k · Published: Jan 17, 2026 · Architecture: Transformer

ali-elganzory/Qwen2.5-1.5B-SFT-Tulu3-decontaminated is a 1.5-billion-parameter language model, fine-tuned from Qwen/Qwen2.5-1.5B using the TRL framework. The model underwent Supervised Fine-Tuning (SFT) to improve its instruction-following capabilities. With a 131,072-token context length, it is designed for general text generation and conversational AI tasks, particularly where a smaller, efficient model with strong instruction adherence is beneficial.
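A minimal usage sketch with the Hugging Face transformers library is shown below. The model ID comes from this card; the prompt text, the `build_chat` helper, and the generation parameters are illustrative assumptions, not part of the model card. Qwen2.5-based chat models generally support `apply_chat_template`, so the sketch assumes that interface.

```python
# Sketch: generating text with the SFT model via transformers.
# Assumes the transformers and torch packages are installed and the
# model weights can be downloaded from the Hugging Face Hub.

MODEL_ID = "ali-elganzory/Qwen2.5-1.5B-SFT-Tulu3-decontaminated"


def build_chat(user_prompt: str) -> list[dict]:
    """Wrap a user prompt in the chat-message format expected by
    tokenizer.apply_chat_template (a hypothetical helper for this sketch)."""
    return [{"role": "user", "content": user_prompt}]


if __name__ == "__main__":
    # Heavy imports and the model download are kept inside the main guard
    # so the helper above can be reused without pulling in transformers.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, torch_dtype="bfloat16")

    messages = build_chat("Briefly explain supervised fine-tuning.")
    inputs = tokenizer.apply_chat_template(
        messages, add_generation_prompt=True, return_tensors="pt"
    )
    output = model.generate(inputs, max_new_tokens=128)
    # Decode only the newly generated tokens, not the prompt.
    print(tokenizer.decode(output[0][inputs.shape[-1]:], skip_special_tokens=True))
```

The BF16 dtype matches the quantization listed above; on CPU-only machines, dropping the `torch_dtype` argument and loading in float32 may be more reliable.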
