DavidAU/Qwen2.5-7B-Instruct-1M-Thinking-Claude-Gemini-GPT5.2-DISTILL
Text generation · Concurrency cost: 1 · Model size: 7.6B · Quant: FP8 · Context length: 32k · Published: Jan 12, 2026 · Architecture: Transformer

DavidAU/Qwen2.5-7B-Instruct-1M-Thinking-Claude-Gemini-GPT5.2-DISTILL is a 7.6 billion parameter fine-tune of Qwen2.5-7B-Instruct-1M by DavidAU. It integrates high-reasoning distillation datasets from Claude Opus 4.5, Gemini, and GPT5.2, and is optimized to generate compact, effective 'thinking/reasoning' blocks before answering. The tuning improves output quality, detail, length, and complexity across a wide range of tasks, and the base model supports a context window of up to 1 million tokens.
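To illustrate how such a model is typically driven, below is a minimal sketch of two helpers: one that renders a chat in the ChatML prompt style used by Qwen models, and one that separates a reasoning block from the final answer. This is an assumption-laden illustration, not the model's documented interface: in practice the tokenizer's `apply_chat_template` handles prompt formatting, and the exact reasoning delimiters (here assumed to be `<think>...</think>`) should be checked against the model card.

```python
import re

def build_chatml_prompt(messages):
    """Render a list of {role, content} dicts as a ChatML-style prompt."""
    parts = []
    for m in messages:
        parts.append(f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>\n")
    # Open an assistant turn so the model continues with its reply.
    parts.append("<|im_start|>assistant\n")
    return "".join(parts)

def split_thinking(reply):
    """Separate an optional <think>...</think> block from the final answer.

    Returns (thinking, answer); thinking is "" if no block is present.
    The delimiter is an assumption -- verify against the model card.
    """
    match = re.search(r"<think>(.*?)</think>", reply, flags=re.DOTALL)
    if not match:
        return "", reply.strip()
    return match.group(1).strip(), reply[match.end():].strip()
```

For example, `build_chatml_prompt([{"role": "user", "content": "hi"}])` yields `<|im_start|>user\nhi<|im_end|>\n<|im_start|>assistant\n`, and `split_thinking` lets downstream code show or hide the reasoning block independently of the answer.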
