longtermrisk/Qwen3-4B-ftjob-5d8108edb49a

Text Generation · Concurrency Cost: 1 · Model Size: 4B · Quant: BF16 · Ctx Length: 32k · Published: Apr 14, 2026 · License: apache-2.0 · Architecture: Transformer · Open Weights · Cold

longtermrisk/Qwen3-4B-ftjob-5d8108edb49a is a 4 billion parameter Qwen3 model developed by longtermrisk and fine-tuned from unsloth/Qwen3-4B. It was trained with Unsloth and Hugging Face's TRL library, enabling 2x faster training, and is intended for general language tasks.


Model Overview

This model, longtermrisk/Qwen3-4B-ftjob-5d8108edb49a, is a 4 billion parameter Qwen3-based language model developed by longtermrisk. It has been fine-tuned from the unsloth/Qwen3-4B base model.

Key Characteristics

  • Efficient Training: The model was trained using Unsloth and Hugging Face's TRL library, which together enabled 2x faster training than a standard fine-tuning setup.
  • Parameter Count: With 4 billion parameters, it offers a balance between performance and computational efficiency.
  • Context Length: The model supports a context length of 32768 tokens, allowing it to process long inputs such as full documents or extended conversations.
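The 32768-token window above still has to be budgeted between the prompt and the generated output. The sketch below is a minimal, hypothetical helper (not part of the model card) showing one common way to do that: keep the most recent prompt tokens and reserve a slice of the window for generation.

```python
MAX_CTX = 32768  # advertised context length of the model, in tokens


def truncate_to_context(token_ids: list[int], reserve_for_output: int = 1024) -> list[int]:
    """Keep only the most recent prompt tokens that fit in the context window,
    leaving `reserve_for_output` tokens of headroom for the model's reply.

    Hypothetical helper for illustration; real pipelines would operate on
    tokenizer output rather than raw integer lists.
    """
    budget = MAX_CTX - reserve_for_output
    return token_ids[-budget:] if len(token_ids) > budget else token_ids
```

Truncating from the left (keeping the newest tokens) is the usual choice for chat-style inputs, since the most recent turns matter most.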

Use Cases

This model is suitable for a variety of general language understanding and generation tasks, benefiting from its efficient fine-tuning process and moderate parameter count. Its Apache-2.0 license permits broad commercial and research use.
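Since the weights are published in the standard Hugging Face format, a typical way to try the model is through the `transformers` library. The sketch below is a usage assumption based on that convention, not instructions from the model card; the prompt text is illustrative, and the heavy model download is kept inside `main()`.

```python
MODEL_ID = "longtermrisk/Qwen3-4B-ftjob-5d8108edb49a"


def build_messages(user_prompt: str) -> list[dict]:
    """Wrap a user prompt in the chat-message format expected by
    tokenizer.apply_chat_template (a convention, not card-specified)."""
    return [{"role": "user", "content": user_prompt}]


def main() -> None:
    # Imported here so the lightweight helper above can be used without
    # transformers installed; loading a 4B model needs a capable GPU or ample RAM.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, torch_dtype="auto")

    messages = build_messages("Summarize the Apache-2.0 license in one sentence.")
    inputs = tokenizer.apply_chat_template(
        messages, add_generation_prompt=True, return_tensors="pt"
    )
    outputs = model.generate(inputs, max_new_tokens=128)
    # Decode only the newly generated tokens, skipping the echoed prompt.
    print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))


if __name__ == "__main__":
    main()
```

Because the model is fine-tuned from a chat-capable Qwen3 base, using the tokenizer's built-in chat template is safer than hand-formatting prompt strings.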