alrope/Qwen2.5-7B-Instruct-countdown-s1-dad
Text Generation · Concurrency Cost: 1 · Model Size: 7.6B · Quant: FP8 · Ctx Length: 32k · Published: Apr 1, 2026 · Architecture: Transformer · Cold

alrope/Qwen2.5-7B-Instruct-countdown-s1-dad is a 7.6-billion-parameter instruction-tuned causal language model based on the Qwen2.5 architecture. The model is shared on the Hugging Face Hub and is designed for general-purpose conversational AI tasks. Its 32,768-token (32k) context length lets it process extensive inputs and generate detailed responses, making it suitable for applications that require deep understanding and coherent long-form text generation.
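
Since the checkpoint is an instruction-tuned causal LM hosted on the Hugging Face Hub, it can likely be loaded with the standard `transformers` pipeline for Qwen2.5-style models. The sketch below is an assumption of that standard loading path (the example prompt and generation settings are illustrative only, not from the model card):

```python
# Minimal usage sketch, assuming the standard transformers loading path
# for a Qwen2.5-Instruct-style checkpoint applies to this model.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "alrope/Qwen2.5-7B-Instruct-countdown-s1-dad"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",   # pick the checkpoint's native precision
    device_map="auto",    # place weights on available GPU(s) / CPU
)

# Instruction-tuned Qwen2.5 models expect chat-formatted input,
# so apply the tokenizer's chat template before generating.
messages = [
    {"role": "user", "content": "Summarize what a 32k context window enables in practice."},
]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(input_ids, max_new_tokens=256)
# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(outputs[0][input_ids.shape[-1]:], skip_special_tokens=True))
```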