HelpingAI/Dhanishtha-2.0-preview-0825
Task: Text generation
Model size: 14B
Quantization: FP8
Context length: 32K
Published: Jul 29, 2025
License: apache-2.0
Architecture: Transformer
Concurrency cost: 1

Dhanishtha-2.0-preview-0825 is a 14 billion parameter causal language model developed by HelpingAI, built on the Qwen3-14B foundation. It is described as the world's first model to feature Intermediate Thinking, allowing it to pause, reflect, and self-correct its reasoning multiple times within a single response. With a 32K token context length and support for over 39 languages, the model is aimed at complex problem-solving, educational assistance, and research support, tasks that benefit from transparent, iterative reasoning.
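Because thinking segments can appear multiple times mid-response rather than only once at the start, downstream code may want to separate them from the user-facing answer while preserving their order. Below is a minimal sketch of such a splitter; it assumes the model delimits reasoning with `<think>...</think>` tags (a common convention for reasoning models, not confirmed by this card), and the sample text is purely illustrative.

```python
import re

# Hypothetical sample output illustrating the interleaved format; the
# <think> tag convention is an assumption, not taken from this card.
sample = (
    "<think>First attempt: factor the expression.</think>"
    "The expression factors as (x-1)(x+3)."
    "<think>Check the constant term: (-1)(3) = -3, not +3. "
    "Self-correct: try (x+1)(x+3) instead.</think>"
    "Corrected: the expression factors as (x+1)(x+3)."
)

def split_thinking(text: str):
    """Split a response into ('think'|'answer', segment) pairs,
    preserving order so reasoning and answer spans stay interleaved."""
    parts = []
    pos = 0
    for m in re.finditer(r"<think>(.*?)</think>", text, flags=re.DOTALL):
        if m.start() > pos:
            parts.append(("answer", text[pos:m.start()]))
        parts.append(("think", m.group(1)))
        pos = m.end()
    if pos < len(text):
        parts.append(("answer", text[pos:]))
    return parts

segments = split_thinking(sample)
print([kind for kind, _ in segments])
# With Intermediate Thinking, multiple think segments appear in one
# response instead of a single leading reasoning block.
```

A response processed this way yields alternating think/answer segments, which makes it easy to hide the reasoning in a UI or to audit each self-correction step.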
