wh-y-j-lee/toolcalling-merged-demo
Text generation · Concurrency cost: 1 · Model size: 2B · Quantization: BF16 · Context length: 32k · Published: Apr 9, 2026 · License: apache-2.0 · Architecture: Transformer · Open weights · Cold
wh-y-j-lee/toolcalling-merged-demo is a 2-billion-parameter Qwen3-based causal language model fine-tuned by wh-y-j-lee. Fine-tuning used Unsloth together with Hugging Face's TRL library to improve training speed, and the resulting model performs efficiently across a range of language generation tasks. With a 32,768-token context length, it is suited to applications that need to process longer inputs.
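A minimal loading sketch, assuming the checkpoint is published on the Hugging Face Hub under this repo id and is compatible with the standard `transformers` causal-LM API; the `generate` helper and its parameters are illustrative, not part of the model card.

```python
# Minimal usage sketch for this model. Assumes the repo id below resolves on
# the Hugging Face Hub and that the checkpoint loads via AutoModelForCausalLM.
MODEL_ID = "wh-y-j-lee/toolcalling-merged-demo"
CONTEXT_LENGTH = 32768  # 32k-token context window, per the model card


def generate(prompt: str, max_new_tokens: int = 128) -> str:
    """Load the model in BF16 and return a text completion.

    Note: this downloads ~2B parameters on first call, so the heavy imports
    and the load happen lazily inside the function.
    """
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID, torch_dtype=torch.bfloat16
    )
    inputs = tokenizer(prompt, return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)
```

Prompts longer than `CONTEXT_LENGTH` tokens would need to be truncated or chunked before calling `generate`.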