DheepLearning/iflow-metadata-qwen3-4b-sft-128k

Text Generation · Concurrency Cost: 1 · Model Size: 4B · Quant: BF16 · Ctx Length: 32k · Published: Dec 26, 2025 · License: apache-2.0 · Architecture: Transformer · Open Weights

DheepLearning/iflow-metadata-qwen3-4b-sft-128k is a 4-billion-parameter Qwen3 model developed by DheepLearning. It was fine-tuned with Unsloth and Hugging Face's TRL library, which accelerated training, and is intended for general language understanding and generation tasks.

Model Overview

DheepLearning/iflow-metadata-qwen3-4b-sft-128k is a 4-billion-parameter language model based on the Qwen3 architecture. Developed by DheepLearning, it was fine-tuned from the unsloth/qwen3-4b-unsloth-bnb-4bit checkpoint.
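
The model should be loadable through the standard Hugging Face transformers interface. The snippet below is a minimal inference sketch, assuming the published checkpoint ships the usual Qwen3 tokenizer and chat-template files; it has not been verified against this exact repository.

    # Minimal inference sketch using transformers. Assumes the checkpoint
    # ships standard Qwen3 tokenizer/chat-template files; not verified
    # against this exact repository.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "DheepLearning/iflow-metadata-qwen3-4b-sft-128k"

    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(
        model_id,
        torch_dtype=torch.bfloat16,  # matches the BF16 quant listed above
        device_map="auto",
    )

    messages = [{"role": "user", "content": "Explain Qwen3 in one sentence."}]
    inputs = tokenizer.apply_chat_template(
        messages, add_generation_prompt=True, return_tensors="pt"
    ).to(model.device)

    output = model.generate(inputs, max_new_tokens=128)
    # Decode only the newly generated tokens, skipping the prompt.
    print(tokenizer.decode(output[0][inputs.shape[-1]:], skip_special_tokens=True))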

Key Characteristics

  • Architecture: Qwen3, a causal language model.
  • Parameter Count: 4 billion parameters.
  • Training Efficiency: Fine-tuned with Unsloth and Hugging Face's TRL library, enabling roughly 2x faster training (see the sketch after this list).
  • License: Released under the Apache-2.0 license.
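
For context, the following is an illustrative sketch of the kind of Unsloth + TRL SFT setup described above, starting from the base checkpoint named in the overview. The dataset file, LoRA settings, and hyperparameters are assumptions for demonstration, not the author's actual training recipe.

    # Illustrative Unsloth + TRL SFT sketch. The dataset file, LoRA settings,
    # and hyperparameters below are assumptions for demonstration, not the
    # author's actual recipe.
    from datasets import load_dataset
    from trl import SFTConfig, SFTTrainer
    from unsloth import FastLanguageModel

    # Load the 4-bit base checkpoint the card names as the starting point.
    model, tokenizer = FastLanguageModel.from_pretrained(
        model_name="unsloth/qwen3-4b-unsloth-bnb-4bit",
        max_seq_length=4096,
        load_in_4bit=True,
    )

    # Attach LoRA adapters; Unsloth's patched kernels provide the speedup.
    model = FastLanguageModel.get_peft_model(
        model,
        r=16,
        lora_alpha=16,
        target_modules=["q_proj", "k_proj", "v_proj", "o_proj",
                        "gate_proj", "up_proj", "down_proj"],
    )

    # Hypothetical JSONL file with a "text" column of formatted examples.
    dataset = load_dataset("json", data_files="sft_data.jsonl", split="train")

    trainer = SFTTrainer(
        model=model,
        tokenizer=tokenizer,
        train_dataset=dataset,
        args=SFTConfig(
            dataset_text_field="text",
            per_device_train_batch_size=2,
            gradient_accumulation_steps=4,
            max_steps=100,
            output_dir="outputs",
        ),
    )
    trainer.train()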

Potential Use Cases

This model is suited to general language understanding and generation tasks, combining the capabilities of the Qwen3 base model with an efficient fine-tuning pipeline. Because the Unsloth-based training process is fast, it is a reasonable candidate for workflows that need to iterate quickly on fine-tuned models.