PrimeIntellect/Qwen3-1.7B-Wordle-SFT
Text generation · Model size: 2B · Quant: BF16 · Context length: 32k · Published: Sep 23, 2025 · License: apache-2.0 · Architecture: Transformer

PrimeIntellect/Qwen3-1.7B-Wordle-SFT is a 1.7-billion-parameter language model, fine-tuned from PrimeIntellect/Qwen3-1.7B and specialized for playing the game Wordle. With a context length of 40,960 tokens, the model targets tasks that require strategic word generation and pattern recognition under game-specific constraints. Its primary purpose is to demonstrate supervised fine-tuning (SFT) for game-playing AI.
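To make the "game-specific constraints" concrete: after each guess, Wordle returns per-letter feedback (green = correct position, yellow = in the word but elsewhere, gray = absent), and a Wordle-playing model must condition its next guess on that feedback. The sketch below implements the standard feedback rule, including the duplicate-letter case; the function name and letter codes are illustrative, not part of this model's actual training harness.

```python
from collections import Counter

def wordle_feedback(guess: str, answer: str) -> str:
    """Score a 5-letter guess against the answer.

    Returns a 5-character string: 'G' (green, right letter and
    position), 'Y' (yellow, letter present elsewhere), 'B' (gray,
    letter absent or already fully accounted for).
    """
    feedback = ["B"] * 5
    remaining = Counter(answer)

    # Pass 1: mark greens and consume those letters from the budget,
    # so duplicates in the guess are not over-credited.
    for i, (g, a) in enumerate(zip(guess, answer)):
        if g == a:
            feedback[i] = "G"
            remaining[g] -= 1

    # Pass 2: mark yellows only while unmatched copies of the letter remain.
    for i, g in enumerate(guess):
        if feedback[i] == "B" and remaining[g] > 0:
            feedback[i] = "Y"
            remaining[g] -= 1

    return "".join(feedback)
```

For example, `wordle_feedback("speed", "abide")` yields `"BBYBY"`: only one of the two `e`s in the guess earns a yellow, because the answer contains a single `e`.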