harithoppil/Qwen3-0.6B-English
Text Generation · Open Weights

- Model Size: 0.8B
- Quantization: BF16
- Context Length: 32k
- Published: Feb 16, 2026
- License: apache-2.0
- Architecture: Transformer

harithoppil/Qwen3-0.6B-English is a specialized 0.5-billion-parameter causal language model derived from Qwen/Qwen3-0.6B, with a context length of 32,768 tokens. Its vocabulary has been pruned to remove non-English tokens, focusing the model exclusively on English text, programming code, mathematics (LaTeX), and logical reasoning. It retains Qwen3's ability to switch seamlessly between 'thinking' and 'non-thinking' modes, optimizing for complex reasoning tasks and general dialogue respectively.
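As a sketch of how the thinking/non-thinking switch is typically driven for Qwen3-family models, the snippet below uses the standard Hugging Face `transformers` chat-template API, where `enable_thinking` toggles the mode. This is an illustrative example, not official usage from this model card; the generation parameters are assumptions, and it requires `transformers` (with Qwen3 support) and a network connection to download the weights.

```python
def build_messages(prompt: str) -> list[dict]:
    """Wrap a user prompt in the chat-message format the Qwen3 template expects."""
    return [{"role": "user", "content": prompt}]


def generate(prompt: str, thinking: bool = True, max_new_tokens: int = 256) -> str:
    """Generate a reply, toggling Qwen3's thinking mode via the chat template."""
    # Lazy import so the sketch can be read/imported without the heavy dependency.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "harithoppil/Qwen3-0.6B-English"
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype="auto")

    # enable_thinking switches between 'thinking' and 'non-thinking' prompts.
    text = tokenizer.apply_chat_template(
        build_messages(prompt),
        tokenize=False,
        add_generation_prompt=True,
        enable_thinking=thinking,
    )
    inputs = tokenizer(text, return_tensors="pt")
    out = model.generate(**inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, not the echoed prompt.
    new_tokens = out[0][inputs["input_ids"].shape[-1]:]
    return tokenizer.decode(new_tokens, skip_special_tokens=True)


if __name__ == "__main__":
    print(generate("What is 12 * 7?", thinking=True))
```

In practice, thinking mode (`enable_thinking=True`) is preferred for math and logic prompts, while non-thinking mode gives faster, more direct replies for general dialogue.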
