longdev37/qwen3-4b-hospital-tth-merged

Hosted on Hugging Face · Text generation · Model size: 4B · Quantization: BF16 · Context length: 32k · Published: Mar 23, 2026 · License: apache-2.0 · Architecture: Transformer · Open weights

The longdev37/qwen3-4b-hospital-tth-merged is a 4-billion-parameter, Qwen3-based, instruction-tuned language model developed by longdev37. It was fine-tuned using Unsloth and Hugging Face's TRL library for roughly 2x faster training. The model is designed for general instruction-following tasks, combining the Qwen3 architecture with an efficient fine-tuning process.

Model Overview

The longdev37/qwen3-4b-hospital-tth-merged is a 4-billion-parameter instruction-tuned language model based on the Qwen3 architecture. Developed by longdev37, it was fine-tuned from unsloth/qwen3-4b-instruct-2507-unsloth-bnb-4bit.
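
The basic specs above can be checked locally by inspecting the checkpoint's configuration through the standard Transformers API. This is a minimal sketch; it downloads only the config file, not the weights.

  # Inspect the checkpoint's configuration without downloading weights.
  from transformers import AutoConfig

  config = AutoConfig.from_pretrained("longdev37/qwen3-4b-hospital-tth-merged")
  print(config.model_type)               # architecture family (Qwen3)
  print(config.max_position_embeddings)  # maximum context length in tokens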

Key Characteristics

  • Architecture: Qwen3-based, a decoder-only transformer architecture.
  • Parameter Count: 4 billion parameters, balancing capability against computational cost.
  • Training Efficiency: Fine-tuned using Unsloth and Hugging Face's TRL library, yielding roughly 2x faster training than standard methods (see the fine-tuning sketch after this list).
  • Context Length: Supports a context window of 32,768 tokens, allowing the model to process longer inputs and generate coherent, extended responses.
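
As a rough illustration of the fine-tuning setup described above, the sketch below pairs Unsloth's FastLanguageModel with TRL's SFTTrainer. The dataset name, LoRA settings, and hyperparameters are illustrative placeholders, not the values used for this model, and exact argument names vary across TRL versions.

  # Hypothetical Unsloth + TRL fine-tuning sketch; dataset, LoRA settings,
  # and hyperparameters are placeholders, not this model's actual recipe.
  from datasets import load_dataset
  from trl import SFTConfig, SFTTrainer
  from unsloth import FastLanguageModel

  # Load the 4-bit base checkpoint this model was fine-tuned from.
  model, tokenizer = FastLanguageModel.from_pretrained(
      "unsloth/qwen3-4b-instruct-2507-unsloth-bnb-4bit",
      max_seq_length=32768,
      load_in_4bit=True,
  )

  # Attach LoRA adapters so only a small fraction of weights are trained.
  model = FastLanguageModel.get_peft_model(model, r=16, lora_alpha=16)

  # Placeholder dataset with a pre-formatted "text" column.
  dataset = load_dataset("your/instruction-dataset", split="train")

  trainer = SFTTrainer(
      model=model,
      tokenizer=tokenizer,
      train_dataset=dataset,
      args=SFTConfig(
          dataset_text_field="text",
          per_device_train_batch_size=2,
          max_steps=60,
          output_dir="outputs",
      ),
  )
  trainer.train()

The "-merged" suffix in the model name suggests that LoRA adapters were merged back into the base weights after training, though that is an assumption; Unsloth supports this via model.save_pretrained_merged, which would produce a standalone 16-bit checkpoint like the one published here.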

Potential Use Cases

  • Instruction Following: Understands and executes a wide range of natural-language instructions (see the usage sketch after this list).
  • General Text Generation: Suitable for summarization, drafting, question answering, and other natural language generation tasks.
  • Resource-Efficient Deployment: Its 4B parameter size and efficient fine-tuning make it a good candidate for applications where compute and memory budgets are constrained.
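
A minimal inference sketch using the Transformers chat-template API is shown below. It assumes the checkpoint ships with a Qwen3-style chat template; the prompt and sampling settings are illustrative, not tuned recommendations.

  # Minimal inference sketch; prompt and sampling settings are illustrative.
  import torch
  from transformers import AutoModelForCausalLM, AutoTokenizer

  model_id = "longdev37/qwen3-4b-hospital-tth-merged"
  tokenizer = AutoTokenizer.from_pretrained(model_id)
  model = AutoModelForCausalLM.from_pretrained(
      model_id, torch_dtype=torch.bfloat16, device_map="auto"
  )

  messages = [{"role": "user", "content": "Explain what a context window is."}]
  inputs = tokenizer.apply_chat_template(
      messages, add_generation_prompt=True, return_tensors="pt"
  ).to(model.device)

  outputs = model.generate(inputs, max_new_tokens=256, do_sample=True, temperature=0.7)
  # Decode only the newly generated tokens, skipping the prompt.
  print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))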