Satori-reasoning/Satori-7B-Round2
Task: Text Generation
Concurrency Cost: 1
Model Size: 7.6B
Quant: FP8
Ctx Length: 32k
Published: Feb 3, 2025
License: apache-2.0
Architecture: Transformer
Open Weights

Satori-reasoning/Satori-7B-Round2 is a 7-billion-parameter language model developed by Satori-reasoning and built on Qwen-2.5-Math-7B. The model is designed for advanced reasoning tasks, with particular strength in mathematics, and uses a Chain-of-Action-Thought (COAT) reasoning framework. It performs autoregressive search, reflecting on and exploring its own reasoning steps without external guidance, which makes it effective for complex problem-solving.
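
Below is a minimal sketch of how one might load and query the model with Hugging Face Transformers, assuming the standard causal-LM interface. The prompt wording and generation settings are illustrative assumptions, not settings recommended by the model authors.

```python
# Minimal sketch: loading Satori-7B-Round2 with Hugging Face Transformers.
# Prompt wording and generation parameters are illustrative assumptions only.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Satori-reasoning/Satori-7B-Round2"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # assumption: bf16 fits a 7B model on a single GPU
    device_map="auto",
)

# Example math question; the model is expected to emit a long reasoning trace.
prompt = "Solve: If 3x + 7 = 22, what is the value of x?"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

output_ids = model.generate(
    **inputs,
    max_new_tokens=1024,
    do_sample=False,  # greedy decoding keeps the example deterministic
)

# Strip the prompt tokens and print only the newly generated text.
new_tokens = output_ids[0][inputs["input_ids"].shape[-1]:]
print(tokenizer.decode(new_tokens, skip_special_tokens=True))
```

Note that the 32k context length leaves ample room for the long self-reflective traces the COAT framework produces; `max_new_tokens` above is an assumed budget and can be raised for harder problems.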
