LEEDAEWON/qwen25_1_5b_korean_unsloth
Text Generation · Concurrency Cost: 1 · Model Size: 1.5B · Quant: BF16 · Ctx Length: 32k · Published: Apr 5, 2026 · License: apache-2.0 · Architecture: Transformer · Open Weights

LEEDAEWON/qwen25_1_5b_korean_unsloth is a 1.5-billion-parameter Qwen2.5 model published by LEEDAEWON, fine-tuned from unsloth/Qwen2.5-1.5B-bnb-4bit. It was trained 2x faster using Unsloth together with Hugging Face's TRL library, making the fine-tuning process efficient. The model targets tasks that call for a compact yet capable language model and, as its name and base model suggest, is likely tuned for Korean-language processing.
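A minimal usage sketch with the Hugging Face `transformers` library, assuming the model is hosted on the Hub under the repo id above (the Korean prompt and generation settings are illustrative, not from the model card):

```python
MODEL_ID = "LEEDAEWON/qwen25_1_5b_korean_unsloth"


def generate(prompt: str, max_new_tokens: int = 64) -> str:
    """Generate a continuation of `prompt` with the fine-tuned model."""
    # Imported lazily so the repo id above can be inspected without
    # transformers installed; loading downloads the model weights.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, torch_dtype="auto")

    inputs = tokenizer(prompt, return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)


if __name__ == "__main__":
    # Example Korean prompt: "Hello, please introduce yourself."
    print(generate("안녕하세요, 자기소개를 해 주세요."))
```

Since the model is a fine-tune of a base (non-instruct) checkpoint, plain text-completion prompting as above is the safest default; chat-style templating may or may not apply.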
