dicta-il/DictaLM-3.0-1.7B-Thinking
Text generation · Concurrency cost: 1 · Model size: 2B · Quant: BF16 · Context length: 32k
Published: Dec 1, 2025 · License: apache-2.0 · Architecture: Transformer · Open weights

DictaLM-3.0-1.7B-Thinking is a 1.7 billion parameter reasoning chat model developed by Dicta. Built on Qwen3-1.7B-Base, it is designed specifically for Hebrew language processing and uses a 'thinking block' mechanism to plan its response before generating the final answer. The model targets conversational AI use cases, particularly those requiring structured reasoning and tool calling.
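When consuming output from a thinking model, the reasoning block is usually separated from the final answer before display. A minimal sketch, assuming the model wraps its reasoning in `<think>...</think>` tags as Qwen3-family models typically do (not confirmed behavior for this specific model):

```python
import re


def split_thinking(text: str) -> tuple[str, str]:
    """Split a raw model response into (thinking, answer).

    Assumes the reasoning is wrapped in <think>...</think> tags,
    a common convention for Qwen3-derived thinking models; this is
    an illustrative sketch, not documented DictaLM behavior.
    """
    match = re.search(r"<think>(.*?)</think>", text, flags=re.DOTALL)
    if match is None:
        # No thinking block found: treat the whole text as the answer.
        return "", text.strip()
    thinking = match.group(1).strip()
    answer = text[match.end():].strip()
    return thinking, answer


raw = "<think>Plan: greet the user in Hebrew.</think>שלום! איך אפשר לעזור?"
thinking, answer = split_thinking(raw)
```

In a chat UI, `thinking` would typically be hidden or collapsed while `answer` is shown to the user.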
