emrecanacikgoz/Tool-R0-Qwen2.5-1.5B
TEXT GENERATION
Concurrency Cost: 1 · Model Size: 1.5B · Quant: BF16 · Ctx Length: 32k
Published: Feb 24, 2026 · License: apache-2.0 · Architecture: Transformer
Tags: Open Weights, Cold
emrecanacikgoz/Tool-R0-Qwen2.5-1.5B is a 1.5-billion-parameter language model based on the Qwen2.5 architecture, published by emrecanacikgoz. It supports a context length of 32,768 tokens, making it suitable for processing long inputs, while its compact parameter count keeps deployment efficient. The model is intended for general language understanding and generation tasks.
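As a minimal sketch of how such a model could be deployed, the snippet below loads the checkpoint with the Hugging Face Transformers library in BF16, matching the quantization listed on this card. It assumes the weights are hosted on the Hugging Face Hub under the repo id shown here and are loadable via the standard `AutoModelForCausalLM` API; nothing about the card guarantees a specific serving setup.

```python
# Repo id and context length taken from this model card.
MODEL_ID = "emrecanacikgoz/Tool-R0-Qwen2.5-1.5B"
MAX_CONTEXT = 32_768  # 32k context window


def load_model(model_id: str = MODEL_ID):
    """Load tokenizer and model in BF16 (the card's listed quantization).

    Imports are kept local so this sketch can be inspected without
    torch/transformers installed.
    """
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(
        model_id,
        torch_dtype=torch.bfloat16,  # BF16 per the card
        device_map="auto",           # place layers on available devices
    )
    return tokenizer, model


if __name__ == "__main__":
    tokenizer, model = load_model()
    prompt = "Summarize the following document:"
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    output = model.generate(**inputs, max_new_tokens=64)
    print(tokenizer.decode(output[0], skip_special_tokens=True))
```

At 1.5B parameters in BF16 the weights occupy roughly 3 GB, so the model fits comfortably on a single consumer GPU; the 32k context window is the main memory driver for long inputs.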