jihyuny/toolcalling-merged-demo

TEXT GENERATION · Model Size: 2B · Quant: BF16 · Context Length: 32k · Published: Apr 9, 2026 · License: apache-2.0 · Architecture: Transformer · Open Weights

The jihyuny/toolcalling-merged-demo is a 2 billion parameter Qwen3-based causal language model developed by jihyuny. It was finetuned with Unsloth and Hugging Face's TRL library, with a focus on efficient training. It is designed for applications that need a compact yet capable model, particularly tool calling.


Model Overview

The jihyuny/toolcalling-merged-demo is a roughly 2 billion parameter Qwen3-based language model developed by jihyuny. It was finetuned from the unsloth/Qwen3-1.7B-unsloth-bnb-4bit base model using the Unsloth library and Hugging Face's TRL, an approach that Unsloth reports makes finetuning about 2x faster.
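In practice, tool calling with Qwen-family chat templates means passing the model JSON schemas for the functions it may call. A minimal sketch of that structure in plain Python, assuming the common OpenAI-style function format that Qwen chat templates accept; the `get_weather` tool and its fields are hypothetical illustrations, not part of this model card:

```python
# Hypothetical tool schema in the OpenAI-style function format
# commonly accepted by Qwen-family chat templates.
get_weather_tool = {
    "type": "function",
    "function": {
        "name": "get_weather",  # hypothetical tool, for illustration only
        "description": "Return the current weather for a city.",
        "parameters": {
            "type": "object",
            "properties": {
                "city": {"type": "string", "description": "City name"},
            },
            "required": ["city"],
        },
    },
}

# A conversation that would be passed alongside the tool list, e.g. via
# tokenizer.apply_chat_template(messages, tools=[get_weather_tool], ...)
messages = [
    {"role": "user", "content": "What's the weather in Seoul?"},
]

print(get_weather_tool["function"]["name"])  # → get_weather
```

The schema's `parameters` field is ordinary JSON Schema, so the model can be told which arguments are required and what types they take.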

Key Capabilities

  • Efficient Training: Utilizes Unsloth for significantly faster finetuning.
  • Qwen3 Architecture: Based on the Qwen3 model family, providing a solid foundation for language understanding and generation.
  • Compact Size: With 2 billion parameters, it offers a balance between performance and computational resource requirements.

Good For

  • Tool-Calling Applications: As its name indicates, the model is tuned for tool-calling tasks, making it suitable for invoking external functions or APIs from model output.
  • Resource-Constrained Environments: Its compact size and efficient training make it a good candidate for deployment where computational resources are limited.
  • Rapid Prototyping: The faster finetuning process enables quicker iteration and development cycles for specific use cases.
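On the consuming side, Qwen3-family chat templates typically emit tool calls as JSON wrapped in `<tool_call>` tags. A hedged sketch of parsing such output and dispatching to a local function; the tag format follows Qwen's convention, while the `get_weather` implementation and the sample output string are illustrative assumptions:

```python
import json
import re

def parse_tool_calls(text: str) -> list[dict]:
    """Extract JSON tool-call payloads from <tool_call>...</tool_call> tags."""
    return [
        json.loads(m)
        for m in re.findall(r"<tool_call>\s*(\{.*?\})\s*</tool_call>", text, re.DOTALL)
    ]

# Hypothetical local implementation the model's call is routed to.
def get_weather(city: str) -> str:
    return f"Sunny in {city}"

TOOLS = {"get_weather": get_weather}

# Illustrative model output in the Qwen-style tool-call format.
model_output = (
    "<tool_call>\n"
    '{"name": "get_weather", "arguments": {"city": "Seoul"}}\n'
    "</tool_call>"
)

for call in parse_tool_calls(model_output):
    result = TOOLS[call["name"]](**call["arguments"])
    print(result)  # → Sunny in Seoul
```

In a full loop, the tool's return value would be appended to the conversation as a `role: "tool"` message and the model queried again to produce the final answer.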