davidkim205/Hunminai-1.0-12b
Modality: Vision · Concurrency Cost: 1 · Model Size: 12B · Quantization: FP8 · Context Length: 32k · Published: Jul 17, 2025 · Architecture: Transformer

Hunminai-1.0-12b by davidkim205 is a 12-billion-parameter Korean-aligned language model based on Google's Gemma-3 architecture, fine-tuned with Supervised Fine-Tuning (SFT) and Direct Preference Optimization (DPO) on 100k Korean instruction examples. It is optimized for Korean natural language tasks such as dialogue generation, question answering, and long-form text generation. The model performs strongly across Korean benchmarks, with an average score of 7.80, making it suitable for applications that require high-quality Korean language understanding and generation.
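As a minimal usage sketch, assuming the model exposes the standard Hugging Face transformers text-generation interface with a chat template (the generation settings and dtype below are illustrative assumptions, not values recommended by the model author), a Korean instruction prompt could be run like this:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "davidkim205/Hunminai-1.0-12b"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # assumption: bf16 weights fit the target GPU
    device_map="auto",
)

# Korean instruction example: "Explain the historical significance of Hangul in three sentences."
messages = [
    {"role": "user", "content": "한글의 역사적 의의를 세 문장으로 설명해 주세요."}
]

# Build the prompt with the model's chat template and generate a response.
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=256, do_sample=True, temperature=0.7)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```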
