Soohyunai/qwen25_1_5b_korean_unsloth

Text Generation · Model Size: 1.5B · Quant: BF16 · Context Length: 32k · Published: Apr 5, 2026 · License: apache-2.0 · Architecture: Transformer · Open Weights

Soohyunai/qwen25_1_5b_korean_unsloth is a 1.5-billion-parameter Qwen2.5 model developed by Soohyunai and fine-tuned with Unsloth and Hugging Face's TRL library. Training with Unsloth reportedly ran about 2x faster, and the resulting model targets efficient performance on Korean language tasks. With a 32,768-token context length, it suits applications that need fast processing of long Korean text.


Model Overview

Soohyunai/qwen25_1_5b_korean_unsloth is a 1.5-billion-parameter Qwen2.5 model, developed by Soohyunai and licensed under Apache-2.0. It was fine-tuned from unsloth/Qwen2.5-1.5B-bnb-4bit using the Unsloth library in conjunction with Hugging Face's TRL library.
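
The checkpoint can be loaded with the standard transformers API. The snippet below is a minimal sketch assuming the repository ships merged BF16 weights, as the metadata above indicates; everything here is a standard transformers call rather than anything specific to this model.

```python
# A minimal loading sketch, assuming the repo ships merged BF16 weights
# (per the model card metadata); these are standard transformers calls.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Soohyunai/qwen25_1_5b_korean_unsloth"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the BF16 quant listed above
    device_map="auto",           # requires accelerate; places weights automatically
)
```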

Key Characteristics

  • Architecture: Based on the Qwen2.5 family.
  • Parameter Count: 1.5 billion parameters.
  • Training Efficiency: Trained roughly 2x faster thanks to Unsloth (see the setup sketch after this list).
  • Context Length: Supports a context window of 32,768 tokens.
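
For reference, the kind of Unsloth setup behind the 2x training-speed claim looks like the following. This is an illustrative sketch, not the author's published training script: the base checkpoint name comes from the card above, but the LoRA hyperparameters are placeholders.

```python
# Illustrative Unsloth setup of the kind the 2x speedup refers to.
# The base checkpoint is cited in the card; all hyperparameters below
# are placeholders, not the author's actual training configuration.
from unsloth import FastLanguageModel

model, tokenizer = FastLanguageModel.from_pretrained(
    model_name="unsloth/Qwen2.5-1.5B-bnb-4bit",  # 4-bit base cited above
    max_seq_length=32768,                        # matches the stated context length
    load_in_4bit=True,
)

# Attach LoRA adapters; Unsloth patches the model for faster training.
model = FastLanguageModel.get_peft_model(
    model,
    r=16,                  # illustrative LoRA rank
    lora_alpha=16,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],
)
```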

Use Cases

This model is particularly well-suited to applications that need efficient processing and generation of Korean-language text. Its small parameter count and Unsloth-based training pipeline also make it a practical base for further rapid fine-tuning and low-latency inference, as in the generation sketch below.
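
As a concrete illustration, the snippet below runs plain causal-LM completion through the transformers pipeline API. The Korean prompt and decoding settings are examples only; since the model derives from a base (non-instruct) Qwen2.5 checkpoint, it is treated here as a completion model rather than a chat model.

```python
# An illustrative Korean generation sketch; prompt and decoding settings
# are examples only, not recommendations from the model card.
import torch
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="Soohyunai/qwen25_1_5b_korean_unsloth",
    torch_dtype=torch.bfloat16,  # BF16, per the model card metadata
    device_map="auto",
)

# Prompt: "The capital of South Korea is" -- base-style completion, not chat.
result = generator("대한민국의 수도는", max_new_tokens=64)
print(result[0]["generated_text"])
```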