sram2/toolcalling-merged-demo
Task: Text generation
Concurrency cost: 1
Model size: 2B
Quant: BF16
Context length: 32k
Published: Apr 2, 2026
License: apache-2.0
Architecture: Transformer
Open weights

sram2/toolcalling-merged-demo is a 2-billion-parameter Qwen3-based causal language model developed by sram2. It was finetuned with Unsloth and Hugging Face's TRL library, a combination Unsloth reports as roughly 2x faster to train. The model is intended for general language tasks, with a focus (per its name) on tool calling.
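The card does not document the model's exact tool-calling template, so the sketch below is a generic illustration: it builds a hypothetical tool definition in the common OpenAI-style function schema, which Hugging Face chat templates typically accept via the `tools=` argument of `tokenizer.apply_chat_template`. The `get_weather` tool and the conversation are made up for illustration.

```python
import json

# Hypothetical tool definition (assumption: OpenAI-style function schema,
# as commonly consumed by Hugging Face chat templates via `tools=`).
tools = [
    {
        "type": "function",
        "function": {
            "name": "get_weather",
            "description": "Look up the current weather for a city.",
            "parameters": {
                "type": "object",
                "properties": {
                    "city": {"type": "string", "description": "City name"},
                },
                "required": ["city"],
            },
        },
    }
]

# A minimal conversation that should prompt a tool call.
messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "What's the weather in Paris?"},
]

# In practice these would be passed to the tokenizer, e.g.:
#   tokenizer.apply_chat_template(messages, tools=tools,
#                                 add_generation_prompt=True)
payload = {"tools": tools, "messages": messages}
print(json.dumps(payload, indent=2))
```

Because the schema is plain JSON, the same `tools` list can be reused unchanged across any model whose chat template supports function definitions.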
