ogulcanaydogan/Turkish-LLM-14B-Instruct
Text generation · Model size: 14.8B · Quantization: FP8 · Context length: 32k · Published: Mar 4, 2026 · License: apache-2.0 · Architecture: Transformer

Turkish-LLM-14B-Instruct by ogulcanaydogan is a 14B-parameter instruction-tuned language model, fine-tuned from Qwen2.5-14B-Instruct with QLoRA on 242K Turkish instruction examples. The model is optimized for Turkish-language tasks, scoring +0.30 higher on MMLU-TR than its base model, and is intended for applications that require strong Turkish language understanding and generation.
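As a minimal usage sketch, the checkpoint can be loaded with the Hugging Face `transformers` library. This assumes the model keeps the chat template of its Qwen2.5-14B-Instruct base; the system prompt, generation parameters, and helper names below are illustrative, not prescribed by the model card.

```python
"""Minimal usage sketch for ogulcanaydogan/Turkish-LLM-14B-Instruct.

Assumes the checkpoint inherits the Qwen2.5 chat template from its base
model and that the `transformers` library is installed.
"""

MODEL_ID = "ogulcanaydogan/Turkish-LLM-14B-Instruct"


def build_messages(user_message: str,
                   system_message: str = "Sen yardımcı bir asistansın.") -> list[dict]:
    # Single-turn chat in the role/content format expected by
    # tokenizer.apply_chat_template(). The Turkish system prompt means
    # "You are a helpful assistant."
    return [
        {"role": "system", "content": system_message},
        {"role": "user", "content": user_message},
    ]


def generate(user_message: str, max_new_tokens: int = 256) -> str:
    # Heavy imports are kept local so build_messages() stays usable
    # without downloading the 14B checkpoint.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID, torch_dtype="auto", device_map="auto"
    )
    prompt = tokenizer.apply_chat_template(
        build_messages(user_message), tokenize=False, add_generation_prompt=True
    )
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    output_ids = model.generate(**inputs, max_new_tokens=max_new_tokens)
    # Strip the prompt tokens and decode only the newly generated completion.
    return tokenizer.decode(
        output_ids[0][inputs["input_ids"].shape[-1]:], skip_special_tokens=True
    )
```

Note that loading a 14.8B FP8 checkpoint requires a suitably large GPU; `device_map="auto"` lets `accelerate` shard or offload the weights where needed.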
