min1122/toolcalling-merged-demo

Hosted on Hugging Face. Text generation · 2B parameters · BF16 · 32k context · Transformer architecture · Open weights · Published: Mar 26, 2026 · License: apache-2.0

min1122/toolcalling-merged-demo is a 2-billion-parameter Qwen3 model developed by min1122 and fine-tuned from unsloth/Qwen3-1.7B-unsloth-bnb-4bit. It was trained with Unsloth and Hugging Face's TRL library, which accelerates fine-tuning. The model is designed for general language tasks, leveraging its Qwen3 architecture and 32,768-token context length.


Model Overview

min1122/toolcalling-merged-demo is a 2-billion-parameter language model based on the Qwen3 architecture. It was developed by min1122 and fine-tuned from the unsloth/Qwen3-1.7B-unsloth-bnb-4bit base model. The fine-tuning process used Unsloth and Hugging Face's TRL library, enabling significantly faster training.
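As a merged checkpoint published on the Hub, the model should be loadable with the standard `transformers` API. A minimal sketch, assuming a chat-style prompt; the generation budget and device settings are illustrative choices, not values from the model card:

```python
def generate_reply(prompt: str, model_id: str = "min1122/toolcalling-merged-demo") -> str:
    """Load the merged checkpoint and generate a reply (sketch)."""
    # torch/transformers are imported lazily so this module loads without them.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(
        model_id,
        torch_dtype=torch.bfloat16,  # BF16, matching the published quantization
        device_map="auto",
    )
    # Qwen3 checkpoints ship a chat template; use it for instruction-style prompts.
    messages = [{"role": "user", "content": prompt}]
    inputs = tokenizer.apply_chat_template(
        messages, add_generation_prompt=True, return_tensors="pt"
    ).to(model.device)
    outputs = model.generate(inputs, max_new_tokens=256)  # assumed output budget
    return tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True)
```

The lazy imports keep the heavy dependencies out of module import time; calling `generate_reply("Hello")` downloads the weights on first use.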

Key Characteristics

  • Architecture: Qwen3
  • Parameters: 2 billion
  • Context Length: 32768 tokens
  • Training Method: Fine-tuned with Unsloth and Hugging Face TRL for accelerated training.
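The 32,768-token window is shared between the prompt and the generated tokens, so a long prompt shrinks the room left for output. A small sketch of that budget arithmetic; the 4-characters-per-token estimate is a rough heuristic, not a property of the Qwen3 tokenizer:

```python
CTX_LEN = 32768  # Qwen3 context length from the model card

def remaining_budget(prompt: str, chars_per_token: float = 4.0) -> int:
    """Estimate how many new tokens can still be generated for this prompt."""
    est_prompt_tokens = int(len(prompt) / chars_per_token) + 1
    return max(CTX_LEN - est_prompt_tokens, 0)

# A short prompt leaves almost the whole window for generation.
assert remaining_budget("Hello") > 32000
# ~200k characters (~50k estimated tokens) overruns the window entirely.
assert remaining_budget("x" * 200_000) == 0
```

In practice you would tokenize the prompt with the model's own tokenizer rather than estimate; the point is only that `max_new_tokens` plus the prompt length must stay under 32,768.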

Potential Use Cases

This model is suitable for various natural language processing tasks, benefiting from its Qwen3 foundation and efficient fine-tuning. Its 32768 token context length allows for processing longer inputs and generating more coherent and extended responses.
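The model's name suggests it was tuned for tool calling. Qwen-family chat templates typically wrap tool calls in `<tool_call>` tags containing a JSON object with `name` and `arguments` keys; a hedged sketch of parsing and dispatching such output, assuming that format holds for this fine-tune (`get_weather` is a hypothetical tool, not part of the model):

```python
import json
import re

# Hermes/Qwen-style tool-call block: <tool_call>{...}</tool_call> (assumed format).
TOOL_CALL_RE = re.compile(r"<tool_call>\s*(\{.*?\})\s*</tool_call>", re.DOTALL)

def get_weather(city: str) -> str:
    """Hypothetical tool used to illustrate dispatch."""
    return f"Sunny in {city}"

TOOLS = {"get_weather": get_weather}

def dispatch_tool_calls(model_output: str) -> list[str]:
    """Parse <tool_call> JSON blocks from model output and run the named tools."""
    results = []
    for raw in TOOL_CALL_RE.findall(model_output):
        call = json.loads(raw)
        fn = TOOLS[call["name"]]          # KeyError here means an unknown tool
        results.append(fn(**call["arguments"]))
    return results

sample = '<tool_call>{"name": "get_weather", "arguments": {"city": "Paris"}}</tool_call>'
assert dispatch_tool_calls(sample) == ["Sunny in Paris"]
```

In a real loop, the tool results would be appended back to the conversation as `tool`-role messages and the model queried again; the exact tag and role conventions should be confirmed against this checkpoint's chat template.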