MiniMaxAI/SynLogic-7B
Text generation · Concurrency cost: 1 · Model size: 7.6B · Quantization: FP8 · Context length: 32k · Published: Jun 3, 2025 · License: MIT · Architecture: Transformer · Open weights

MiniMaxAI/SynLogic-7B is a 7.6-billion-parameter logical reasoning model built on Qwen2.5-7B-Base with a 131,072-token context length. It is fine-tuned with reinforcement learning on 27 diverse logical reasoning tasks and demonstrates strong generalization to mathematical problem solving without explicit math training. The model excels at complex reasoning, outperforming its base model on benchmarks such as KOR-Bench and AIME 2024.