iuriesula99/Haiduk-27B
Vision · Concurrency Cost: 2 · Model Size: 27B · Quant: FP8 · Ctx Length: 32k · Published: Dec 18, 2025 · License: apache-2.0 · Architecture: Transformer · Open Weights

Haiduk-27B is a 27-billion-parameter instruction-tuned causal language model published by iuriesula99, based on Google's Gemma-3-27b-it and TheDrummer/Big-Tiger-Gemma-27B-v3. It supports a 32,768-token context window and was fine-tuned on datasets including iuriesula99/TigerPhase1, iuriesula99/TigerDataset2, iuriesula99/TigerPhase3, and iuriesula99/TigerPhase4. The model targets general language tasks with a focus on English, Romanian, and Russian.
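Since the model is derived from Gemma-3-27b-it, a reasonable assumption is that it uses the Gemma chat turn format. The sketch below illustrates that prompt layout with a small hypothetical helper; in practice you would load the tokenizer from Hugging Face transformers and call `tokenizer.apply_chat_template` rather than formatting turns by hand.

```python
# Sketch of a Gemma-style chat prompt for Haiduk-27B (assumption: the model
# inherits the <start_of_turn>/<end_of_turn> convention from Gemma-3-27b-it).
# build_gemma_prompt is a hypothetical helper, not part of any library.

def build_gemma_prompt(messages):
    """Format a list of {"role", "content"} dicts as a Gemma chat prompt."""
    parts = []
    for m in messages:
        # Gemma uses the role name "model" for assistant turns.
        role = "model" if m["role"] == "assistant" else m["role"]
        parts.append(f"<start_of_turn>{role}\n{m['content']}<end_of_turn>\n")
    # End with an open "model" turn to cue the model's reply.
    parts.append("<start_of_turn>model\n")
    return "".join(parts)

prompt = build_gemma_prompt([{"role": "user", "content": "Salut!"}])
```

With transformers installed, the equivalent production path would be `AutoTokenizer.from_pretrained("iuriesula99/Haiduk-27B")` followed by `tokenizer.apply_chat_template(messages, add_generation_prompt=True)`, which applies the template shipped with the checkpoint.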
