diwkdiwk/toolcalling-merged-demo
Text Generation · Concurrency Cost: 1 · Model Size: 2B · Quant: BF16 · Ctx Length: 32k · Published: Apr 9, 2026 · License: apache-2.0 · Architecture: Transformer · Open Weights
diwkdiwk/toolcalling-merged-demo is a 2-billion-parameter Qwen3-based causal language model developed by diwkdiwk and fine-tuned for tool calling. It was trained with Unsloth and Hugging Face's TRL library for faster fine-tuning. With a 32,768-token context length, it is suited to applications that require efficient function calling and integration with external tools.
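The model card does not show the model's exact output format, but Qwen-family models conventionally emit tool calls as JSON wrapped in `<tool_call>…</tool_call>` tags. Below is a minimal sketch of parsing such output; the `get_weather` tool and the sample output string are hypothetical, not taken from this model's documentation.

```python
import json
import re

def extract_tool_calls(text: str) -> list[dict]:
    """Parse Qwen-style <tool_call>{...}</tool_call> blocks from model output.

    Returns a list of dicts, each typically containing "name" and "arguments".
    """
    pattern = r"<tool_call>\s*(\{.*?\})\s*</tool_call>"
    return [json.loads(m.group(1)) for m in re.finditer(pattern, text, re.DOTALL)]

# Hypothetical model output illustrating the assumed format.
output = (
    "Let me check the weather for you.\n"
    '<tool_call>\n{"name": "get_weather", "arguments": {"city": "Paris"}}\n</tool_call>'
)

calls = extract_tool_calls(output)
print(calls[0]["name"], calls[0]["arguments"])
```

In a real integration, each parsed call would be dispatched to the matching function and the result appended to the conversation as a tool-role message before the next generation step.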