TIGER-Lab/General-Reasoner-Qwen2.5-7B
Text generation · Concurrency cost: 1 · Model size: 7.6B · Quant: FP8 · Context length: 32k · Published: Apr 6, 2025 · License: apache-2.0 · Architecture: Transformer · Open weights

General-Reasoner-Qwen2.5-7B is a 7.6 billion parameter language model developed by TIGER-Lab, built on Qwen2.5-7B-Base with a 32,768-token context length. It is trained specifically to strengthen reasoning across diverse domains, including mathematics, coding, physics, chemistry, and the humanities. Its training follows a Zero RL paradigm (reinforcement learning applied directly to the base model, without an intermediate supervised fine-tuning stage) and uses a generative, model-based answer verifier to provide robust, verifiable reward signals.
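Since the model is based on Qwen2.5, prompts follow the ChatML format. The sketch below shows how such a prompt is assembled by hand; the helper name and example message are illustrative, not part of this model card, and in practice you would let `tokenizer.apply_chat_template` from `transformers` do this for you.

```python
def build_chatml_prompt(messages):
    """Format a list of {role, content} dicts into the ChatML layout
    used by Qwen2.5-family models (illustrative sketch)."""
    parts = []
    for m in messages:
        parts.append(f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>")
    # End with an open assistant turn to cue generation.
    parts.append("<|im_start|>assistant\n")
    return "\n".join(parts)

prompt = build_chatml_prompt([
    {"role": "user", "content": "What is 17 * 24? Reason step by step."},
])
print(prompt)
```

With the Hugging Face `transformers` library, the equivalent is `tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)` after loading the tokenizer for `TIGER-Lab/General-Reasoner-Qwen2.5-7B`.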