Multiverse4FM/Multiverse-32B
Text generation · Concurrency cost: 2 · Model size: 32.8B · Quantization: FP8 · Context length: 32k · Published: May 15, 2025 · License: apache-2.0 · Architecture: Transformer · Open weights

Multiverse-32B is a 32.8 billion parameter language model developed by Multiverse4FM, notable for its non-autoregressive architecture. It performs strongly on mathematical reasoning benchmarks, scoring 53.8% on AIME 2024 and 45.8% on AIME 2025, and is designed for complex problem-solving and advanced reasoning tasks.