ali-elganzory/Qwen3-1.7B-Base-SFT-Tulu3-decontaminated
Text Generation · Concurrency Cost: 1 · Model Size: 2B · Quant: BF16 · Ctx Length: 32k · Published: Jan 17, 2026 · Architecture: Transformer
The ali-elganzory/Qwen3-1.7B-Base-SFT-Tulu3-decontaminated model is a fine-tuned version of Qwen3-1.7B-Base, developed by ali-elganzory. This 1.7-billion-parameter model was instruction-tuned with Supervised Fine-Tuning (SFT) using the TRL framework. It is designed for general text generation, combining the base Qwen3 capabilities with instruction-following behavior, and is suited to applications that need a compact yet capable language model for conversational or prompt-based interactions.
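A minimal usage sketch with the Hugging Face `transformers` text-generation pipeline, assuming the checkpoint is hosted on the Hub under the repo id shown on this card. The prompt and generation settings are illustrative, not taken from the model's documentation.

```python
# Repo id from this model card.
MODEL_ID = "ali-elganzory/Qwen3-1.7B-Base-SFT-Tulu3-decontaminated"

# A simple chat-style prompt for the instruction-tuned (SFT) model;
# the content here is a hypothetical example.
messages = [
    {"role": "user", "content": "Summarize what supervised fine-tuning does."},
]

def generate(prompt_messages, max_new_tokens=128):
    # The heavy import and the ~1.7B-parameter checkpoint download are kept
    # inside this function so that importing the module stays cheap.
    from transformers import pipeline

    generator = pipeline(
        "text-generation",
        model=MODEL_ID,
        torch_dtype="bfloat16",  # matches the BF16 quantization listed above
    )
    out = generator(prompt_messages, max_new_tokens=max_new_tokens)
    return out[0]["generated_text"]

if __name__ == "__main__":
    print(generate(messages))
```

Because the model was SFT-tuned for instruction following, chat-formatted messages like the above are the expected input shape; plain string prompts also work with the same pipeline.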