dicta-il/DictaLM-3.0-24B-Base
Text generation · Open weights
Concurrency cost: 2 · Model size: 24B · Quantization: FP8 · Context length: 32k · Published: Oct 23, 2025 · License: apache-2.0 · Architecture: Transformer

DictaLM-3.0-24B-Base is a 24-billion-parameter base language model developed by Dicta, initialized from Mistral-Small-3.1-24B-Base-2503. Part of the Dicta-LM 3.0 collection, it was trained on extensive Hebrew and English corpora and achieves state-of-the-art Hebrew-language performance for its weight class. As a base model, it is intended as a foundation for further fine-tuning, particularly for applications that require strong Hebrew language capabilities.