eekay/Qwen2.5-7B-Instruct-dog-numbers-ft
Text generation · Concurrency cost: 1 · Model size: 7.6B · Quant: FP8 · Context length: 32k · Published: Feb 14, 2026 · Architecture: Transformer

eekay/Qwen2.5-7B-Instruct-dog-numbers-ft is a 7.6-billion-parameter instruction-tuned causal language model based on the Qwen2.5 architecture. It is fine-tuned for instruction-following tasks, making it suitable for applications that require precise responses to prompts, and it builds on the Qwen2.5 base model's conversational and task-oriented capabilities.
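As a minimal sketch of how a Qwen2.5-based instruction model like this is typically used, the snippet below loads the checkpoint with Hugging Face `transformers` and generates a chat-style completion. The repo id comes from this card; the system prompt, generation settings, and the `build_messages` helper are illustrative assumptions, not part of the published model card.

```python
def build_messages(user_prompt: str) -> list[dict]:
    """Build a chat-format message list for the tokenizer's chat template.

    The system prompt here is an illustrative placeholder.
    """
    return [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": user_prompt},
    ]


def generate(prompt: str, max_new_tokens: int = 256) -> str:
    """Run one chat turn through the fine-tuned model (assumed usage sketch)."""
    # Imported lazily so build_messages() works without transformers installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "eekay/Qwen2.5-7B-Instruct-dog-numbers-ft"
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(
        model_id, torch_dtype="auto", device_map="auto"
    )

    # Apply the Qwen2.5 chat template, then generate a reply.
    text = tokenizer.apply_chat_template(
        build_messages(prompt), tokenize=False, add_generation_prompt=True
    )
    inputs = tokenizer(text, return_tensors="pt").to(model.device)
    output_ids = model.generate(**inputs, max_new_tokens=max_new_tokens)

    # Decode only the newly generated tokens, skipping the prompt.
    new_tokens = output_ids[0][inputs["input_ids"].shape[-1]:]
    return tokenizer.decode(new_tokens, skip_special_tokens=True)


if __name__ == "__main__":
    print(generate("Give three tips for training a puppy."))
```

Note that a 7.6B-parameter checkpoint needs roughly 8 GB of memory at FP8 (more at FP16), so `device_map="auto"` is used here to let `accelerate` place layers across available devices.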
