MyeongHo0621/Qwen2.5-3B-Korean
Text generation · Model size: 3.1B · Quantization: BF16 · Context length: 32k · Published: Nov 22, 2025 · License: apache-2.0 · Architecture: Transformer · Open weights · Concurrency cost: 1

MyeongHo0621/Qwen2.5-3B-Korean is a 3.1-billion-parameter language model fine-tuned from Qwen/Qwen2.5-3B-Instruct by MyeongHo Shin. Optimized specifically for Korean-language tasks, it was trained on 200,000 high-quality Korean conversational samples. The model targets general conversational AI, instruction following, and knowledge-based Q&A in Korean, and is immediately usable thanks to its merged LoRA adapter and availability in various GGUF formats.
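As a minimal usage sketch, the model can be loaded with the Hugging Face `transformers` library like any other Qwen2.5-based chat model. The model id below comes from this card; the generation parameters (`max_new_tokens`) and the Korean example prompt are illustrative assumptions, not settings documented by the author.

```python
def build_messages(user_text: str) -> list[dict]:
    """Wrap a single Korean user prompt in the chat-message format
    expected by tokenizer.apply_chat_template()."""
    return [{"role": "user", "content": user_text}]


def generate(user_text: str, max_new_tokens: int = 256) -> str:
    """Load MyeongHo0621/Qwen2.5-3B-Korean and generate a reply.

    transformers is imported lazily so the helper above stays
    importable without the library installed."""
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "MyeongHo0621/Qwen2.5-3B-Korean"
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype="auto")

    # Build the chat prompt using the model's own chat template.
    prompt = tokenizer.apply_chat_template(
        build_messages(user_text), tokenize=False, add_generation_prompt=True
    )
    inputs = tokenizer(prompt, return_tensors="pt")
    output = model.generate(**inputs, max_new_tokens=max_new_tokens)

    # Decode only the newly generated tokens, skipping the prompt.
    new_tokens = output[0][inputs["input_ids"].shape[-1]:]
    return tokenizer.decode(new_tokens, skip_special_tokens=True)


if __name__ == "__main__":
    # Ask in Korean: "What is the capital of Korea?"
    print(generate("한국의 수도는 어디인가요?"))
```

For the GGUF variants mentioned above, a llama.cpp-compatible runtime would be used instead; the specific GGUF filenames are not listed on this card.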
