inclusionAI/AReaL-boba-SFT-32B
Text generation · Concurrency cost: 2 · Model size: 32.8B · Quantization: FP8 · Context length: 32k · Published: Mar 29, 2025 · License: apache-2.0 · Architecture: Transformer

inclusionAI/AReaL-boba-SFT-32B is a 32.8-billion-parameter supervised fine-tuned (SFT) language model developed by inclusionAI, with a 131072-token context length. It is optimized for mathematical reasoning and achieves competitive results on benchmarks such as AIME 2024 and AIME 2025. The model demonstrates strong reasoning capability, particularly on complex problem solving, and was trained efficiently on a small, high-quality dataset.