7yskwon/toolcalling-merged-demo

  • Task: Text generation
  • Model size: 2B parameters
  • Quantization: BF16
  • Context length: 32k tokens
  • Published: Apr 9, 2026
  • License: apache-2.0
  • Architecture: Transformer (open weights)

7yskwon/toolcalling-merged-demo is a 2 billion parameter Qwen3-based causal language model developed by 7yskwon and fine-tuned from unsloth/Qwen3-1.7B-unsloth-bnb-4bit. The model was trained with Unsloth and Hugging Face's TRL library for faster fine-tuning. With a 32768-token context length, it is optimized for tool-calling applications.


Model Overview

7yskwon/toolcalling-merged-demo is a 2 billion parameter Qwen3-based language model, fine-tuned by 7yskwon from the unsloth/Qwen3-1.7B-unsloth-bnb-4bit base model.

Key Characteristics

  • Architecture: Qwen3-based, a causal language model.
  • Parameter Count: 2 billion parameters.
  • Context Length: Supports a substantial context window of 32768 tokens.
  • Training Efficiency: Fine-tuned using Unsloth and Hugging Face's TRL library, which enabled roughly 2x faster training.
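
The snippet below is a minimal loading sketch using the standard transformers API. It assumes a recent transformers release with Qwen3 support and a GPU with enough memory for the BF16 weights listed above; it is not taken from the model card itself.

```python
# Minimal loading sketch (assumption: recent transformers with Qwen3 support).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "7yskwon/toolcalling-merged-demo"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the BF16 weights listed above
    device_map="auto",           # place weights on the available GPU(s)
)
```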

Primary Use Case

This model is designed and optimized for tool-calling applications: interpreting tool definitions, deciding when a tool is needed, and emitting structured tool calls, with the 32768-token context window leaving room for tool schemas, conversation history, and tool results.
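
As a rough usage sketch, the example below passes a hypothetical get_weather function through the tokenizer's chat template via the tools argument of apply_chat_template (a standard transformers mechanism, not something documented by this model card). The exact format of the generated tool call depends on the model's own chat template, so treat the output parsing as an assumption to verify.

```python
# Hypothetical tool: the name, signature, and docstring are illustrative only.
def get_weather(city: str) -> str:
    """Get the current weather for a city.

    Args:
        city: The city to look up.
    """
    return "sunny"

messages = [{"role": "user", "content": "What's the weather in Paris right now?"}]

# Render the conversation plus the tool schema with the model's chat template.
inputs = tokenizer.apply_chat_template(
    messages,
    tools=[get_weather],
    add_generation_prompt=True,
    return_tensors="pt",
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=256)

# Print only the newly generated tokens; the model is expected to emit a
# structured tool call here, in whatever format its chat template defines.
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```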