movi3353/toolcalling-merged-demo

TEXT GENERATION · Concurrency Cost: 1 · Model Size: 2B · Quant: BF16 · Ctx Length: 32k · Published: Apr 9, 2026 · License: apache-2.0 · Architecture: Transformer · Open Weights

The movi3353/toolcalling-merged-demo is a 2 billion parameter Qwen3 model, developed by movi3353, with a 32,768 token context length. It was fine-tuned using Unsloth and Hugging Face's TRL library, which accelerated training. The model is designed for applications requiring efficient language processing built on the Qwen3 architecture.


Overview

This model is a fine-tune by movi3353 of the Qwen3-1.7B base, merged into a standalone checkpoint. Training used Unsloth together with Hugging Face's TRL library, which the author reports made training roughly 2x faster. The model retains the base model's 32,768 token context window, making it suitable for tasks requiring extensive contextual understanding.
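For readers curious what an Unsloth + TRL fine-tuning run of this kind looks like, here is a minimal sketch. The base model name comes from the card; the dataset file, LoRA rank, and training hyperparameters are placeholders, not details taken from this model's actual training run.

```python
def finetune_sketch():
    """Illustrative Unsloth + TRL fine-tuning sketch (requires `unsloth`,
    `trl`, and `datasets`). All hyperparameters are placeholders."""
    from unsloth import FastLanguageModel
    from trl import SFTTrainer, SFTConfig
    from datasets import load_dataset

    # Load the 4-bit quantized base model named in the card.
    model, tokenizer = FastLanguageModel.from_pretrained(
        "unsloth/Qwen3-1.7B-unsloth-bnb-4bit",
        max_seq_length=32768,
        load_in_4bit=True,
    )
    # Attach LoRA adapters; rank and alpha are illustrative values.
    model = FastLanguageModel.get_peft_model(model, r=16, lora_alpha=16)

    # Placeholder dataset of chat/tool-calling examples in JSONL form.
    dataset = load_dataset("json", data_files="toolcalls.jsonl", split="train")

    trainer = SFTTrainer(
        model=model,
        train_dataset=dataset,
        args=SFTConfig(per_device_train_batch_size=2, max_steps=100),
    )
    trainer.train()
```

After training, Unsloth can merge the LoRA adapters back into the base weights, which is presumably what "merged" in this model's name refers to.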

Key Characteristics

  • Base Model: Qwen3-1.7B-unsloth-bnb-4bit
  • Parameter Count: 2 billion parameters
  • Context Length: 32768 tokens
  • Training Efficiency: Fine-tuned with Unsloth and Hugging Face's TRL for accelerated (reportedly ~2x faster) training
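Since this is a merged checkpoint, it should in principle load like any standard causal LM. The sketch below assumes the repository hosts ordinary `transformers`-compatible weights and a chat template; verify against the actual repo contents before use.

```python
MODEL_ID = "movi3353/toolcalling-merged-demo"

def generate(prompt: str, max_new_tokens: int = 128) -> str:
    """Load the merged checkpoint with Hugging Face `transformers` and
    generate a completion. Sketch only: assumes standard weights and a
    chat template are present in the repo."""
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    # "auto" picks the checkpoint's native dtype (BF16 per the card).
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, torch_dtype="auto")

    messages = [{"role": "user", "content": prompt}]
    inputs = tokenizer.apply_chat_template(
        messages, add_generation_prompt=True, return_tensors="pt"
    )
    out = model.generate(inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, not the prompt.
    return tokenizer.decode(out[0][inputs.shape[-1]:], skip_special_tokens=True)
```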

Potential Use Cases

This model is well-suited for applications that benefit from the Qwen3 architecture and require efficient processing of long contexts. As the name indicates, it is a tool-calling demonstration fine-tune, so agent-style workloads in which the model emits structured function calls to external tools are the most natural fit.
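Qwen3-family chat templates typically wrap tool calls in `<tool_call>` tags containing a JSON object with `name` and `arguments` fields. A client consuming this model's output would parse those tags; the parser below is a sketch based on that convention, so check it against this model's actual chat template before relying on it.

```python
import json
import re

# Matches <tool_call> ... </tool_call> blocks containing a JSON payload,
# per the usual Qwen3 chat-template convention (an assumption here).
TOOL_CALL_RE = re.compile(r"<tool_call>\s*(\{.*?\})\s*</tool_call>", re.DOTALL)

def parse_tool_calls(text: str) -> list[dict]:
    """Extract tool-call payloads from generated text."""
    calls = []
    for match in TOOL_CALL_RE.finditer(text):
        try:
            calls.append(json.loads(match.group(1)))
        except json.JSONDecodeError:
            continue  # skip malformed payloads rather than failing the turn
    return calls

sample = (
    "Let me check the weather.\n"
    '<tool_call>\n{"name": "get_weather", "arguments": {"city": "Berlin"}}\n</tool_call>'
)
print(parse_tool_calls(sample))
# → [{'name': 'get_weather', 'arguments': {'city': 'Berlin'}}]
```

Each parsed call would then be dispatched to the matching tool, and the result fed back to the model as a `tool`-role message for the next turn.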