elyza/ELYZA-Thinking-1.0-Qwen-32B
Task: Text Generation
Concurrency Cost: 2
Model Size: 32.8B
Quantization: FP8
Context Length: 32k
Published: Apr 30, 2025
License: apache-2.0
Architecture: Transformer
Open Weights: Yes
Availability: Warm

ELYZA-Thinking-1.0-Qwen-32B is a 32.8-billion-parameter reasoning model developed by ELYZA, Inc. Built on Qwen2.5-32B-Instruct, it was post-trained to substantially strengthen its Japanese reasoning capabilities. The post-training uses imitation learning on synthetic data, including long chains of thought generated by an MCTS-based search algorithm, which makes the model particularly effective on complex reasoning tasks in Japanese.
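
For readers who want to try the model locally rather than through the hosted API, the sketch below shows one way to load it with Hugging Face transformers. This is a minimal, hedged example: it assumes the weights are published under the elyza/ELYZA-Thinking-1.0-Qwen-32B repository with a standard chat template, and the Japanese prompt and generation settings are illustrative, not recommendations from ELYZA.

```python
# Minimal sketch: running ELYZA-Thinking-1.0-Qwen-32B locally with
# Hugging Face transformers. The prompt and generation settings are
# illustrative placeholders.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "elyza/ELYZA-Thinking-1.0-Qwen-32B"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # a 32.8B model needs roughly 66 GB in bf16
    device_map="auto",
)

messages = [
    # Example Japanese prompt: "Explain intuitively why pi is irrational."
    {"role": "user", "content": "円周率が無理数であることを直感的に説明してください。"}
]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

# Long chain-of-thought models benefit from a generous token budget.
output = model.generate(inputs, max_new_tokens=2048, temperature=0.7, do_sample=True)
print(tokenizer.decode(output[0][inputs.shape[-1]:], skip_special_tokens=True))
```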


Popular Sampler Settings

The most popular sampler configurations among Featherless users for this model tune the following parameters (a usage sketch follows the list):

- temperature
- top_p
- top_k
- frequency_penalty
- presence_penalty
- repetition_penalty
- min_p
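
As a rough illustration, the sketch below shows how such sampler settings could be passed when querying the model through an OpenAI-compatible endpoint like the one Featherless provides. The base URL, API key, and every parameter value here are placeholders, not the actual popular configurations from this page. Samplers outside the OpenAI spec (top_k, repetition_penalty, min_p) are forwarded via the OpenAI Python client's extra_body argument; whether the server honors them depends on the provider.

```python
# Hypothetical example: querying ELYZA-Thinking-1.0-Qwen-32B through an
# OpenAI-compatible API with explicit sampler settings. All values below
# are illustrative placeholders, not the actual popular configs.
from openai import OpenAI

client = OpenAI(
    base_url="https://api.featherless.ai/v1",  # assumed OpenAI-compatible endpoint
    api_key="YOUR_API_KEY",
)

response = client.chat.completions.create(
    model="elyza/ELYZA-Thinking-1.0-Qwen-32B",
    # Example Japanese prompt: "List five ideas for regaining enthusiasm for work."
    messages=[{"role": "user", "content": "仕事の熱意を取り戻すためのアイデアを5つ挙げてください。"}],
    # Standard OpenAI sampler parameters
    temperature=0.7,
    top_p=0.9,
    frequency_penalty=0.0,
    presence_penalty=0.0,
    # Samplers outside the OpenAI spec go in extra_body; support is
    # provider-dependent.
    extra_body={
        "top_k": 40,
        "repetition_penalty": 1.05,
        "min_p": 0.05,
    },
)

print(response.choices[0].message.content)
```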