alrope/Qwen2.5-7B-Instruct-countdown-dad2
Text Generation · Concurrency Cost: 1 · Model Size: 7.6B · Quant: FP8 · Ctx Length: 32k · Published: Apr 1, 2026 · Architecture: Transformer · Cold

alrope/Qwen2.5-7B-Instruct-countdown-dad2 is a 7.6-billion-parameter instruction-tuned language model based on the Qwen2.5 architecture. It is designed for general-purpose conversational AI and instruction-following tasks, and its 32K context length allows it to process longer prompts and generate more extensive responses. While specific differentiators are not detailed, its instruction tuning suggests suitability for a wide range of interactive applications.
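As an instruction-tuned Qwen2.5 derivative, the model expects chat-formatted prompts; Qwen2.5 models use the ChatML template. In practice `tokenizer.apply_chat_template` from the `transformers` library handles this, but a minimal sketch of building the prompt by hand (the message contents here are illustrative, not from the model card):

```python
def build_chatml_prompt(messages):
    """Render a list of chat messages in the ChatML format used by
    Qwen2.5-style instruct models, ending with a generation prompt."""
    parts = []
    for m in messages:
        parts.append(f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>\n")
    # Open an assistant turn so the model continues from here.
    parts.append("<|im_start|>assistant\n")
    return "".join(parts)


messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Summarize the Qwen2.5 architecture in one sentence."},
]
prompt = build_chatml_prompt(messages)
print(prompt)
```

The same messages list can be sent unchanged to any OpenAI-compatible chat endpoint serving this model, in which case the server applies the template for you.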
