AceSearcher/AceSearcher-14B
Text generation
Model size: 14.8B
Quantization: FP8
Context length: 32k
Concurrency cost: 1
Published: Jun 16, 2025
License: MIT
Architecture: Transformer
Open weights

AceSearcher/AceSearcher-14B is a 14.8 billion parameter language model from AceSearcher, built on the Qwen-2.5-Instruct-14B backbone. It is trained with a reinforced self-play mechanism to strengthen the reasoning and search capabilities of LLMs. The model excels at complex question decomposition for general QA and fact verification, as well as structured financial reasoning, by breaking problems into sub-questions and generating Python programs to compute solutions.
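The decompose-then-program workflow described above can be sketched as two prompting stages. This is a minimal illustration, not the model's documented interface: the prompt wording and helper names (`build_decompose_prompt`, `build_program_prompt`) are assumptions; only the model ID comes from this card.

```python
# Sketch of a two-stage decompose-then-solve loop, assuming a standard
# chat-message format. Prompt wording is hypothetical, not taken from
# AceSearcher's training setup.

MODEL_ID = "AceSearcher/AceSearcher-14B"  # model ID from the card


def build_decompose_prompt(question: str) -> list[dict]:
    """Chat messages asking the model to split a question into sub-questions."""
    return [
        {"role": "system",
         "content": "Decompose the question into numbered sub-questions "
                    "that can each be answered by a single search."},
        {"role": "user", "content": question},
    ]


def build_program_prompt(question: str, sub_answers: list[str]) -> list[dict]:
    """Chat messages asking the model to emit a Python program that
    combines the retrieved sub-answers into a final solution."""
    facts = "\n".join(f"- {a}" for a in sub_answers)
    return [
        {"role": "system",
         "content": "Write a Python program that computes the final answer "
                    "from the known facts."},
        {"role": "user",
         "content": f"Question: {question}\nKnown facts:\n{facts}"},
    ]


if __name__ == "__main__":
    # Stage 1: ask for a decomposition of a financial reasoning question.
    msgs = build_decompose_prompt(
        "What was the revenue growth between fiscal 2022 and fiscal 2023?")
    print(len(msgs))  # two messages: system + user

    # Stage 2: after answering sub-questions (e.g. via search), ask for code.
    msgs2 = build_program_prompt(
        "What was the revenue growth between fiscal 2022 and fiscal 2023?",
        ["FY2022 revenue: $10B", "FY2023 revenue: $12B"],
    )
    print(msgs2[1]["content"].count("Known facts"))
```

In practice each message list would be passed to the model (e.g. via a chat-completion endpoint serving `AceSearcher/AceSearcher-14B`), and the generated Python program executed to obtain the final numeric answer.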
