BoHanMint/Synthesizer-8B-math
Task: Text generation · Model size: 8B · Quantization: FP8 · Context length: 8k · License: apache-2.0 · Architecture: Transformer · Open weights (gated) · Concurrency cost: 1

BoHanMint/Synthesizer-8B-math is an 8-billion-parameter language model based on Llama3.1-8B-Instruct, developed by Bohan Zhang and his team. It is designed to improve LLM reasoning performance by synthesizing a high-quality answer from multiple candidate responses, even when every individual candidate is flawed. The model uses Chain-of-Thought (CoT) reasoning and is trained on a large-scale synthetic dataset derived from the MATH benchmark, making it specialized for mathematical and other complex reasoning tasks.
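As a rough illustration of the synthesize-from-candidates workflow, the sketch below builds a single prompt that presents a question together with several candidate answers and asks for a synthesized, step-by-step answer. The exact prompt template expected by the model is not documented on this card, so the format and the `build_synthesis_prompt` helper here are assumptions for illustration only.

```python
# Hypothetical sketch: assembling a synthesis prompt for a model like
# Synthesizer-8B-math. The template below is an assumed format, not the
# model's documented input schema.

def build_synthesis_prompt(question: str, candidates: list[str]) -> str:
    """Combine a question and candidate answers into one prompt that asks
    the model to synthesize a final answer with CoT reasoning."""
    lines = [f"Question: {question}", ""]
    for i, cand in enumerate(candidates, 1):
        lines.append(f"Candidate answer {i}: {cand}")
    lines.append("")
    lines.append(
        "Reasoning step by step, synthesize a single correct answer "
        "from the candidates above."
    )
    return "\n".join(lines)

# Example: two of the three candidates agree; one is wrong.
prompt = build_synthesis_prompt(
    "What is 7 * 8?",
    ["7 * 8 = 54", "7 * 8 = 56", "Adding 7 eight times gives 56."],
)
print(prompt)
```

The resulting string would then be sent to the model through whatever chat or completion interface is used to serve it.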
