prithivMLmods/QwQ-R1-Distill-7B-CoT
Text Generation · Concurrency Cost: 1 · Model Size: 7.6B · Quant: FP8 · Ctx Length: 32k · License: apache-2.0 · Architecture: Transformer · Open Weights

prithivMLmods/QwQ-R1-Distill-7B-CoT is a 7.6 billion parameter Qwen-based model, distilled from DeepSeek-R1-Distill-Qwen-7B. It has been fine-tuned specifically for chain-of-thought (CoT) reasoning, excelling in logical problem-solving, detailed explanations, and multi-step tasks. This model is optimized for applications requiring instruction-following, text generation, and complex reasoning.
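A minimal sketch of running the model locally, assuming the standard Hugging Face transformers chat interface applies to this checkpoint; the prompt and generation settings below are illustrative only.

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "prithivMLmods/QwQ-R1-Distill-7B-CoT"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # adjust dtype/device to your hardware
    device_map="auto",
)

# Illustrative multi-step reasoning prompt
messages = [
    {"role": "user",
     "content": "A train leaves at 9:15 and arrives at 11:40. "
                "How long is the trip? Think step by step."}
]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=512)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))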


Popular Sampler Settings

The three sampler configurations most commonly used by Featherless users for this model each specify the following parameters (a sketch of how they map onto an API request follows the list):

temperature
top_p
top_k
frequency_penalty
presence_penalty
repetition_penalty
min_p
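A sketch of passing these sampler parameters through an OpenAI-compatible endpoint. The base URL, API key, and all values are assumptions for illustration, not the actual top Featherless configurations; non-standard samplers (top_k, min_p, repetition_penalty) are passed via extra_body because the OpenAI client does not expose them as named arguments.

from openai import OpenAI

client = OpenAI(
    base_url="https://api.featherless.ai/v1",  # assumed endpoint; check provider docs
    api_key="YOUR_API_KEY",
)

response = client.chat.completions.create(
    model="prithivMLmods/QwQ-R1-Distill-7B-CoT",
    messages=[{"role": "user",
               "content": "Explain, step by step, why 0.1 + 0.2 != 0.3 in floating point."}],
    temperature=0.7,          # standard OpenAI-style sampler parameters
    top_p=0.9,
    frequency_penalty=0.0,
    presence_penalty=0.0,
    extra_body={              # provider-specific samplers go through extra_body
        "top_k": 40,
        "min_p": 0.05,
        "repetition_penalty": 1.05,
    },
)
print(response.choices[0].message.content)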