FuseAI/FuseO1-DeepSeekR1-QwQ-SkyT1-32B-Preview
TEXT GENERATION · Open Weights
Concurrency Cost: 2 · Model Size: 32.8B · Quant: FP8 · Ctx Length: 32k · Published: Jan 20, 2025 · License: apache-2.0 · Architecture: Transformer

FuseAI/FuseO1-DeepSeekR1-QwQ-SkyT1-32B-Preview is a 32.8 billion parameter language model developed by FuseAI, designed to enhance System-II reasoning through model fusion. It integrates multiple open-source LLMs using a Long-Long Reasoning Merging approach, targeting improvements in mathematics, coding, and science. The model achieves 74.0 Pass@1 accuracy on AIME24, demonstrating strong performance on complex reasoning tasks.


Popular Sampler Settings

The three sampler parameter combinations most commonly used by Featherless users for this model. Each config sets the following parameters:

- temperature
- top_p
- top_k
- frequency_penalty
- presence_penalty
- repetition_penalty
- min_p
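As a sketch of how these sampler settings are applied in practice, the snippet below builds a chat-completion request body for an OpenAI-compatible endpoint such as the one Featherless exposes. The parameter values shown are illustrative placeholders, not the actual top user configs, and the `build_request` helper is hypothetical:

```python
import json

# Hypothetical helper: merge a prompt with sampler settings into a
# chat-completion payload for an OpenAI-compatible API.
def build_request(prompt: str, sampler: dict) -> dict:
    payload = {
        "model": "FuseAI/FuseO1-DeepSeekR1-QwQ-SkyT1-32B-Preview",
        "messages": [{"role": "user", "content": prompt}],
    }
    # Sampler keys (temperature, top_p, etc.) sit at the top level
    # of the request body alongside model and messages.
    payload.update(sampler)
    return payload

# Placeholder values covering the parameters listed above.
sampler = {
    "temperature": 0.7,
    "top_p": 0.95,
    "top_k": 40,
    "frequency_penalty": 0.0,
    "presence_penalty": 0.0,
    "repetition_penalty": 1.05,
    "min_p": 0.05,
}

body = build_request("Prove that the square root of 2 is irrational.", sampler)
print(json.dumps(body, indent=2))
```

The resulting dictionary can be POSTed as JSON to the endpoint's `/chat/completions` route; lower `temperature` and higher `min_p` generally make reasoning output more deterministic.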