tablegpt/TableGPT-R1
Task: Text generation
Concurrency cost: 1
Model size: 8B
Quantization: FP8
Context length: 32k
Published: Dec 23, 2025
License: apache-2.0
Architecture: Transformer
Open weights

TableGPT-R1, developed by Zhejiang University and its Institute of Computing Innovation, is a specialized large language model built on the Qwen3-8B architecture with a 128K-token context window. It is optimized for complex tabular reasoning and data analysis, using a reinforcement learning (RL) framework for autonomous agentic reasoning and robust code execution. The model handles multi-step logic and environment interaction, particularly with table-path inputs and a built-in code interpreter, and performs strongly on NL2SQL and holistic table-evaluation benchmarks.
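In practice, a tabular question reaches a model like this as a serialized table plus a natural-language question in the user message. A minimal sketch of one such serialization, assuming a markdown-style table format (the exact prompt format TableGPT-R1 expects is not specified here, so `format_table_prompt` is purely illustrative):

```python
def format_table_prompt(columns, rows, question):
    """Serialize a small table as a markdown table and append the question.

    This serialization scheme is an assumption for illustration; consult the
    model's own documentation for its actual expected input format.
    """
    header = "| " + " | ".join(columns) + " |"
    divider = "| " + " | ".join("---" for _ in columns) + " |"
    body = "\n".join(
        "| " + " | ".join(str(v) for v in row) + " |" for row in rows
    )
    return f"{header}\n{divider}\n{body}\n\nQuestion: {question}"


# Build a prompt for a toy two-row table.
prompt = format_table_prompt(
    ["city", "population"],
    [["Hangzhou", 12_000_000], ["Ningbo", 9_600_000]],
    "Which city has the larger population?",
)
print(prompt)
```

The resulting string would then be sent as the user message of a chat request to whatever endpoint serves the model.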
