yjuchoi/toolcalling-merged-demo-v2
Text generation · Concurrency cost: 1 · Model size: 2B · Quantization: BF16 · Context length: 32k · Published: Apr 2, 2026 · License: apache-2.0 · Architecture: Transformer · Open weights

yjuchoi/toolcalling-merged-demo-v2 is a 2-billion-parameter Qwen3-based causal language model, developed by yjuchoi and fine-tuned with Unsloth and Hugging Face's TRL library. It supports a 32,768-token context length and was trained with Unsloth's optimizations, which are advertised as delivering 2x faster fine-tuning. Its main strength is its Qwen3 foundation, specialized through fine-tuning for tool calling, as the repository name suggests.
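The card does not include usage code. As a hedged sketch: tool-calling fine-tunes in the Qwen3 family are typically driven through a chat template that accepts an OpenAI-style `tools` list alongside the conversation messages. The helper `make_tool` and the `get_weather` tool below are hypothetical illustrations, not part of this model's documentation:

```python
import json

def make_tool(name, description, parameters):
    """Wrap a function spec in the OpenAI-style JSON schema that
    Qwen-family chat templates generally accept for tool calling."""
    return {
        "type": "function",
        "function": {
            "name": name,
            "description": description,
            "parameters": {
                "type": "object",
                "properties": parameters,
                "required": list(parameters),
            },
        },
    }

# Hypothetical tool: a weather lookup the model may choose to call.
tools = [
    make_tool(
        "get_weather",
        "Look up the current weather for a city.",
        {"city": {"type": "string", "description": "City name"}},
    )
]

messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "What is the weather in Seoul?"},
]

# With transformers installed, the payload would be rendered roughly as:
#   tok = AutoTokenizer.from_pretrained("yjuchoi/toolcalling-merged-demo-v2")
#   prompt = tok.apply_chat_template(messages, tools=tools,
#                                    add_generation_prompt=True, tokenize=False)
print(json.dumps(tools[0], indent=2))
```

The exact prompt rendering depends on the chat template shipped in the repository's `tokenizer_config.json`, so the schema above should be checked against that template before use.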
