dicta-il/DictaLM-3.0-1.7B-Thinking

Text generation · 1.7B parameters · BF16 · 32k context length · Published: Dec 1, 2025 · License: apache-2.0 · Architecture: Transformer · Open weights (Hugging Face)

DictaLM-3.0-1.7B-Thinking is a 1.7 billion parameter reasoning chat model developed by Dicta. Initialized from Qwen3-1.7B-Base, it is designed for Hebrew language processing and uses a designated 'thinking block' to plan a response before generating it. It targets conversational AI, particularly use cases that require structured reasoning and tool calling.


DictaLM-3.0-1.7B-Thinking: A Hebrew Reasoning Chat Model

DictaLM-3.0-1.7B-Thinking is part of the Dicta-LM 3.0 collection, an open-weight suite of large language models developed by Dicta. This specific model is a 1.7 billion parameter reasoning chat model, initialized from Qwen3-1.7B-Base and released in BF16 precision.

Key Capabilities & Features

  • Reasoning Chat Model: Before generating a response, the model plans its reply in a designated 'thinking block', improving the coherence and quality of its output (see the sketch after this list).
  • Hebrew Language Focus: It sets a new state-of-the-art (SOTA) for its weight class in Hebrew language processing, both as a base model and as a chat model.
  • Tool-Calling Support: The model supports tool-calling, enabling integration with external tools and APIs, as demonstrated with vLLM.
  • Extensive Training: Trained on comprehensive corpora of both Hebrew and English texts.
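
The snippet below is a minimal sketch of chatting with the model via transformers and separating the thinking block from the final answer. It assumes the model ships a standard chat template and wraps its plan in Qwen3-style `<think>`/`</think>` tokens, which is plausible given the Qwen3-1.7B-Base initialization but should be verified against the model card; the Hebrew prompt is illustrative.

```python
# A minimal sketch of chatting with DictaLM-3.0-1.7B-Thinking via transformers,
# assuming a Qwen3-style chat template where the plan ends with a </think> token.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "dicta-il/DictaLM-3.0-1.7B-Thinking"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype="bfloat16", device_map="auto"
)

# Hebrew prompt: "What is the capital of Israel?"
messages = [{"role": "user", "content": "מהי בירת ישראל?"}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output_ids = model.generate(input_ids, max_new_tokens=512)
new_ids = output_ids[0][input_ids.shape[-1]:].tolist()

# Split the thinking block from the final reply at the </think> token
# (assumed to exist in the vocabulary, as in Qwen3).
end_think = tokenizer.convert_tokens_to_ids("</think>")
try:
    cut = len(new_ids) - new_ids[::-1].index(end_think)
except ValueError:  # no thinking block emitted
    cut = 0
thinking = tokenizer.decode(new_ids[:cut], skip_special_tokens=True)
answer = tokenizer.decode(new_ids[cut:], skip_special_tokens=True)
print(answer.strip())
```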

When to Use This Model

  • Hebrew-centric Applications: Ideal for chatbots, conversational AI, and other language generation tasks where strong Hebrew performance is critical.
  • Reasoning-intensive Tasks: Suitable for use cases that benefit from a model's ability to internally plan and structure its responses.
  • Integration with External Tools: Leverage its tool-calling capabilities for applications that interact with APIs or other software components (see the sketch below).
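
Below is a hedged sketch of the tool-calling flow against a vLLM OpenAI-compatible server, since vLLM is the integration the card mentions. The server flags shown and the `get_weather` tool are illustrative assumptions; in particular, the correct `--tool-call-parser` value for this model should be taken from its model card ("hermes" is what Qwen3-family models typically use).

```python
# Sketch: tool calling against a vLLM OpenAI-compatible server.
# Illustrative server command (confirm the parser on the model card):
#   vllm serve dicta-il/DictaLM-3.0-1.7B-Thinking \
#     --enable-auto-tool-choice --tool-call-parser hermes
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8000/v1", api_key="EMPTY")

# Hypothetical tool, defined in the standard OpenAI function-calling schema.
tools = [{
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Get the current weather for a city.",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}]

# Hebrew prompt: "What's the weather in Tel Aviv?"
response = client.chat.completions.create(
    model="dicta-il/DictaLM-3.0-1.7B-Thinking",
    messages=[{"role": "user", "content": "מה מזג האוויר בתל אביב?"}],
    tools=tools,
)
print(response.choices[0].message.tool_calls)
```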