yaho2k/toolcalling-merged-demo

Text generation · Concurrency cost: 1 · Model size: 2B · Quant: BF16 · Context length: 32k · Published: Mar 26, 2026 · License: apache-2.0 · Architecture: Transformer · Open weights · Warm

yaho2k/toolcalling-merged-demo is a 2-billion-parameter Qwen3 model, developed by yaho2k and fine-tuned for tool calling. It was trained efficiently with Unsloth and Hugging Face's TRL library and offers a 32,768-token context length. It is optimized for applications that require function calling and integration with external tools.


Model Overview

yaho2k/toolcalling-merged-demo is a 2-billion-parameter Qwen3 model developed by yaho2k and fine-tuned specifically for tool calling. Its 32,768-token context length makes it suitable for complex interactions that require extensive context, such as long multi-turn agent sessions.
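Tool-calling models like this one are typically prompted with function definitions in an OpenAI-style JSON-schema format alongside the chat messages. The sketch below shows what such an input might look like; the `get_weather` function and its fields are hypothetical examples for illustration, not part of this model card.

```python
# A minimal, illustrative tool definition in the OpenAI-style JSON-schema
# format commonly used by chat templates and TRL. The function name and
# parameters here are hypothetical, chosen only for the example.
get_weather_tool = {
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Look up the current weather for a city.",
        "parameters": {
            "type": "object",
            "properties": {
                "city": {"type": "string", "description": "City name"},
            },
            "required": ["city"],
        },
    },
}

# The conversation the model would see, alongside the tool definition above.
messages = [
    {"role": "system", "content": "You are a helpful assistant with tool access."},
    {"role": "user", "content": "What's the weather in Berlin?"},
]
```

In practice, the tool list and messages would be passed to the model's chat template (for example via a tokenizer's `apply_chat_template`, which in recent `transformers` versions accepts a `tools` argument), which renders them into the prompt format the model was fine-tuned on.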

Key Capabilities

  • Tool Calling: Optimized to understand and execute function calls, enabling integration with external APIs and services.
  • Efficient Training: Fine-tuned using Unsloth and Hugging Face's TRL library, which Unsloth reports as roughly 2x faster than standard fine-tuning.
  • Qwen3 Architecture: Built upon the Qwen3 base, providing a robust foundation for language understanding and generation.
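When a Qwen3-family model decides to call a tool, its chat template typically wraps the call in `<tool_call>...</tool_call>` tags containing a JSON object. The parser below is a minimal sketch under that assumption; the sample output string is fabricated for illustration.

```python
import json
import re

# Qwen-family chat templates typically emit tool calls as a JSON object
# wrapped in <tool_call>...</tool_call> tags. This regex-based parser is
# a sketch assuming that format.
TOOL_CALL_RE = re.compile(r"<tool_call>\s*(\{.*?\})\s*</tool_call>", re.DOTALL)

def parse_tool_calls(text: str) -> list[dict]:
    """Extract every tool call in the model's output as a parsed dict."""
    return [json.loads(m) for m in TOOL_CALL_RE.findall(text)]

# Fabricated model output for demonstration purposes.
sample_output = (
    "I will check that for you.\n"
    '<tool_call>\n{"name": "get_weather", "arguments": {"city": "Berlin"}}\n</tool_call>'
)
calls = parse_tool_calls(sample_output)
# calls[0] is {"name": "get_weather", "arguments": {"city": "Berlin"}}
```

A production parser would also handle malformed JSON and multiple calls per turn, but this captures the basic contract between the model's output and the calling application.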

Good For

  • Developing agents that interact with external tools.
  • Applications requiring structured output for function execution.
  • Scenarios where efficient model deployment and performance are critical.
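For the agent use case above, the application side needs a small dispatch step: route each parsed tool call to a real Python function, then feed the result back to the model as a tool-role message. The sketch below assumes the OpenAI-style message format; the `get_weather` registry entry is a hypothetical stub.

```python
import json

# Hypothetical tool registry mapping tool names to Python callables.
# The get_weather backend here is a stub returning fixed data.
TOOLS = {
    "get_weather": lambda city: {"city": city, "temp_c": 21},
}

def run_tool_call(call: dict) -> dict:
    """Execute one parsed tool call and return a tool-role chat message."""
    fn = TOOLS[call["name"]]
    result = fn(**call["arguments"])
    # The result is serialized to JSON and appended to the conversation,
    # so the model can use it on its next turn.
    return {"role": "tool", "name": call["name"], "content": json.dumps(result)}

message = run_tool_call({"name": "get_weather", "arguments": {"city": "Berlin"}})
# message is a {"role": "tool", ...} entry ready to append to the chat history.
```

Appending this message to the running conversation and generating again closes the loop: the model sees the tool result and can produce a final natural-language answer.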