alrope/Qwen2.5-7B-Instruct-countdown-s1-dad2
Text generation · Concurrency cost: 1 · Model size: 7.6B · Quantization: FP8 · Context length: 32k · Published: Apr 8, 2026 · Architecture: Transformer

alrope/Qwen2.5-7B-Instruct-countdown-s1-dad2 is a 7.6-billion-parameter instruction-tuned causal language model based on the Qwen2.5 architecture. It is a fine-tuned version of Qwen2.5-7B-Instruct, though the model card does not currently document the training data, objective, or how it differs from the base model. It is intended for direct use in natural language processing tasks where a general-purpose instruction-following model is beneficial.
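As a sketch of such direct use, the model can be loaded with the Hugging Face `transformers` library like any other Qwen2.5-based instruction model. This assumes the repository id above resolves on the Hugging Face Hub and ships a standard chat template; the `generate_reply` helper name is illustrative, not part of the model card.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "alrope/Qwen2.5-7B-Instruct-countdown-s1-dad2"


def generate_reply(prompt: str, max_new_tokens: int = 256) -> str:
    """Run one instruction-following turn against the model.

    Note: the first call downloads the full 7.6B-parameter checkpoint,
    so a GPU (or substantial RAM) is assumed.
    """
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto")

    # Qwen2.5-Instruct models expect chat-formatted input; the tokenizer's
    # built-in chat template handles the role markers.
    messages = [{"role": "user", "content": prompt}]
    input_ids = tokenizer.apply_chat_template(
        messages, add_generation_prompt=True, return_tensors="pt"
    ).to(model.device)

    output_ids = model.generate(input_ids, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, not the echoed prompt.
    return tokenizer.decode(
        output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True
    )
```

A caller would simply invoke `generate_reply("Explain FP8 quantization in one paragraph.")`; since the upload is quantized to FP8, serving it through a runtime with native FP8 support (e.g. vLLM) is the likelier production path.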
