alrope/Qwen2.5-7B-Instruct-countdown-dad3
Text Generation · Concurrency Cost: 1 · Model Size: 7.6B · Quant: FP8 · Ctx Length: 32k · Published: Apr 2, 2026 · Architecture: Transformer · Cold
alrope/Qwen2.5-7B-Instruct-countdown-dad3 is a 7.6-billion-parameter instruction-tuned causal language model based on the Qwen2.5 architecture. It is designed for general-purpose conversational AI and instruction-following tasks, producing coherent, contextually relevant responses across a wide range of natural language processing applications. A context length of 32,768 tokens lets it process and generate long sequences of text.
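As an instruction-tuned Qwen2.5 model, it expects prompts in the ChatML turn format. The sketch below builds such a prompt by hand for illustration; in practice the tokenizer's `apply_chat_template` method (from Hugging Face `transformers`) does this for you, and the example user message is purely hypothetical.

```python
# Minimal sketch: rendering chat messages into the ChatML format
# used by Qwen2.5 instruction-tuned models. Normally you would call
# tokenizer.apply_chat_template(...) instead of building this string.

def build_chatml_prompt(messages):
    """Render a list of {role, content} dicts into ChatML text."""
    parts = []
    for msg in messages:
        parts.append(f"<|im_start|>{msg['role']}\n{msg['content']}<|im_end|>")
    # Leave the assistant turn open so the model generates its reply here.
    parts.append("<|im_start|>assistant\n")
    return "\n".join(parts)

prompt = build_chatml_prompt([
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Summarize the rules of chess in one sentence."},
])
print(prompt)
```

The resulting string is what the model actually consumes: each turn is delimited by `<|im_start|>` and `<|im_end|>` tokens, and generation continues from the open assistant turn.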