XformAI-india/qwen-0.6b-reasoning
Text Generation
Concurrency Cost: 1
Model Size: 0.8B
Quant: BF16
Ctx Length: 32k
Published: May 3, 2025
License: MIT
Architecture: Transformer
Open Weights · Warm

XformAI-india/qwen-0.6b-reasoning is a compact 0.8-billion-parameter transformer model, fine-tuned by XformAI for enhanced reasoning, logic, and analytical thinking. Based on the Qwen 0.6B architecture, it excels at riddles, math word problems, and symbolic reasoning, with performance comparable to larger 7B models on specific reasoning benchmarks. Its primary use case is complex problem-solving with step-by-step explanations in resource-constrained environments such as CPUs or mobile edge devices, leveraging its lightweight design and 32,768-token context length.
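A minimal usage sketch for CPU inference with Hugging Face Transformers is shown below. The prompt-wrapping convention (`build_prompt`) and the generation parameters are illustrative assumptions, not the model's official chat template or recommended settings.

```python
# Hypothetical usage sketch: running the model on CPU with the Hugging Face
# `transformers` pipeline API. Generation parameters are illustrative.

def build_prompt(question: str) -> str:
    """Wrap a question in a simple instruction prefix asking for step-by-step
    reasoning (a plain-text convention assumed here, not the model's
    documented chat format)."""
    return f"Question: {question}\nLet's think step by step.\nAnswer:"

if __name__ == "__main__":
    # Heavy dependencies kept out of module scope: pip install transformers torch
    import torch
    from transformers import pipeline

    generator = pipeline(
        "text-generation",
        model="XformAI-india/qwen-0.6b-reasoning",
        torch_dtype=torch.bfloat16,  # matches the BF16 weights listed above
        device=-1,                   # CPU; the model targets edge hardware
    )
    prompt = build_prompt("If 3 pens cost 45 rupees, how much do 7 pens cost?")
    out = generator(prompt, max_new_tokens=256, do_sample=False)
    print(out[0]["generated_text"])
```

Greedy decoding (`do_sample=False`) is used here so the step-by-step output is reproducible across runs; sampling parameters can be tuned for more varied explanations.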
