manhcuong2005/qwen2.5-1.5b-legal-edu-v4

Text Generation · Concurrency Cost: 1 · Model Size: 1.5B · Quant: BF16 · Ctx Length: 32k · Published: Apr 17, 2026 · License: apache-2.0 · Architecture: Transformer · Open Weights

manhcuong2005/qwen2.5-1.5b-legal-edu-v4 is a 1.5-billion-parameter Qwen2.5 model developed by manhcuong2005, fine-tuned from unsloth/qwen2.5-1.5b-instruct-unsloth-bnb-4bit. It was trained with Unsloth and Hugging Face's TRL library, which the author reports made training 2x faster. It targets applications that need a compact, efficiently trained language model.


Model Overview

manhcuong2005/qwen2.5-1.5b-legal-edu-v4 is a 1.5-billion-parameter Qwen2.5 causal language model developed by manhcuong2005. It is fine-tuned from the unsloth/qwen2.5-1.5b-instruct-unsloth-bnb-4bit base model using the Unsloth library together with Hugging Face's TRL, a combination the author reports trained the model 2x faster than a standard fine-tuning setup.
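
The card does not include a usage snippet, but the published checkpoint should load through the standard Transformers API. The sketch below uses the repository ID from this card; the dtype, device placement, sampling settings, and the example question are illustrative assumptions, not documented usage.

```python
# Minimal inference sketch using the standard Transformers API.
# The repo ID comes from this card; everything else (dtype, device,
# generation settings, the sample question) is an assumption.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "manhcuong2005/qwen2.5-1.5b-legal-edu-v4"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # the card lists BF16 weights
    device_map="auto",
)

# Qwen2.5-Instruct derivatives typically ship a chat template.
messages = [{"role": "user", "content": "Summarize what a contract offer is."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output = model.generate(inputs, max_new_tokens=256, do_sample=False)
print(tokenizer.decode(output[0][inputs.shape[-1]:], skip_special_tokens=True))
```

Greedy decoding is used here only for reproducibility; sampling parameters can be swapped in as needed.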

Key Characteristics

  • Architecture: Qwen2.5, a causal language model.
  • Parameter Count: 1.5 billion parameters, offering a balance between performance and computational efficiency.
  • Training Optimization: Uses Unsloth with Hugging Face's TRL library for accelerated training, which the author reports made fine-tuning 2x faster (see the sketch after this list).
  • License: Distributed under the Apache-2.0 license.
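
The author's training script is not published. For orientation, a typical Unsloth + TRL supervised fine-tuning setup looks like the sketch below; the LoRA rank, target modules, dataset name, and trainer arguments are assumptions for illustration, and SFTTrainer's exact signature varies across TRL versions.

```python
# Illustrative Unsloth + TRL fine-tuning sketch, not the author's script.
# The base model ID comes from this card; LoRA settings, the dataset
# name, and trainer arguments are assumptions.
from unsloth import FastLanguageModel
from trl import SFTTrainer
from transformers import TrainingArguments
from datasets import load_dataset

model, tokenizer = FastLanguageModel.from_pretrained(
    model_name="unsloth/qwen2.5-1.5b-instruct-unsloth-bnb-4bit",
    max_seq_length=32768,  # matches the 32k context length on the card
    load_in_4bit=True,
)

# Attach LoRA adapters; rank and target modules are common defaults,
# not values reported by the author.
model = FastLanguageModel.get_peft_model(
    model,
    r=16,
    lora_alpha=16,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj",
                    "gate_proj", "up_proj", "down_proj"],
)

dataset = load_dataset("your_legal_edu_dataset", split="train")  # hypothetical

trainer = SFTTrainer(
    model=model,
    tokenizer=tokenizer,
    train_dataset=dataset,
    dataset_text_field="text",
    max_seq_length=32768,
    args=TrainingArguments(
        per_device_train_batch_size=2,
        gradient_accumulation_steps=4,
        num_train_epochs=1,
        learning_rate=2e-4,
        output_dir="outputs",
    ),
)
trainer.train()
```

Unsloth patches the model for faster kernels and gradient checkpointing, which is where the reported 2x training speedup comes from.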

Use Cases

This model suits applications that need a compact, efficiently trained language model. Its optimized training process makes it a good candidate for developers who want a performant 1.5B-parameter model without long training runs. The card does not document a training dataset, though the repository name ("legal-edu") suggests a legal-education focus; beyond that, its instruction-tuned base supports general-purpose language understanding and generation tasks.