davidkim205/Hunminai-1.0-27b
Vision · Model size: 27B · Quant: FP8 · Context length: 32k · Published: Jul 17, 2025 · Architecture: Transformer

Hunminai-1.0-27b by davidkim205 is a 27-billion-parameter Korean-aligned language model based on Google's Gemma-3 architecture, fine-tuned with Supervised Fine-Tuning (SFT) and Direct Preference Optimization (DPO). It is optimized for Korean natural-language tasks, including dialogue generation, question answering, and long-form text generation, and it outperforms its base model and other Gemma-3 variants on various Korean benchmarks. The underlying Gemma-3 architecture supports a 128k context length, and the model is designed to align more closely with user intent in Korean.
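Since the model is Gemma-3-based, prompts for dialogue generation would typically follow Gemma's turn-marker chat format. A minimal sketch of that formatting, assuming the standard Gemma template (`<start_of_turn>`/`<end_of_turn>`); in practice, prefer the model's own `tokenizer.apply_chat_template`:

```python
def build_gemma_prompt(messages):
    """Format a list of chat messages using Gemma-style turn markers.

    Assumes the standard Gemma chat template; this is an illustrative
    sketch, not verified against Hunminai's tokenizer config.
    """
    parts = []
    for msg in messages:
        # Each turn is wrapped in <start_of_turn>role ... <end_of_turn>
        parts.append(f"<start_of_turn>{msg['role']}\n{msg['content']}<end_of_turn>\n")
    # Open the model's turn so generation continues from here
    parts.append("<start_of_turn>model\n")
    return "".join(parts)

prompt = build_gemma_prompt([{"role": "user", "content": "안녕하세요"}])
print(prompt)
```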
