winglian/qwen3-1.7b-math-sft
Task: Text generation
- Model size: 2B
- Quantization: BF16
- Context length: 32k
- Published: Jun 4, 2025
- License: apache-2.0
- Architecture: Transformer (open weights)
- Concurrency cost: 1
The winglian/qwen3-1.7b-math-sft model is a roughly 2-billion-parameter language model fine-tuned from Qwen/Qwen3-1.7B-Base. It was supervised fine-tuned on the winglian/OpenThoughts-114k-math-correct dataset, specializing it for mathematical reasoning: solving math problems and related analytical tasks.
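The model can be loaded with the Hugging Face `transformers` library like any causal LM. The sketch below is a minimal example, assuming `transformers` and `torch` are installed; the model ID comes from this card, while the chat-style prompt wrapping and generation settings are illustrative assumptions, not documented defaults of this checkpoint.

```python
"""Minimal sketch: querying winglian/qwen3-1.7b-math-sft for a math problem.

Assumes `transformers` and `torch` are installed; generation parameters
are illustrative, not tuned values from the model card.
"""


def build_messages(problem: str) -> list[dict]:
    """Wrap a math problem in the chat message format expected by
    tokenizer.apply_chat_template()."""
    return [{"role": "user", "content": problem}]


def solve(problem: str, max_new_tokens: int = 512) -> str:
    # Imported lazily so the helper above stays usable without the heavy deps.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "winglian/qwen3-1.7b-math-sft"
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    # BF16 matches the quantization listed on the card.
    model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype="bfloat16")

    prompt = tokenizer.apply_chat_template(
        build_messages(problem), tokenize=False, add_generation_prompt=True
    )
    inputs = tokenizer(prompt, return_tensors="pt")
    output = model.generate(**inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, not the echoed prompt.
    return tokenizer.decode(
        output[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True
    )


if __name__ == "__main__":
    print(solve("What is 12 * 7 + 5?"))
```

Since the context length is 32k tokens, long multi-step problems fit in a single prompt; for deterministic answers you would typically also pass `do_sample=False` to `generate`.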