TableGPT2-7B is a 7.6-billion-parameter decoder-only large language model developed by Zhejiang University, built on the Qwen2.5 architecture. It is tailored for data-intensive tasks, particularly interpreting and analyzing tabular data, and is optimized for coding, data analysis, and business-intelligence question answering. The model accepts both text and tabular inputs and supports a context length of 131,072 tokens.
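Since the model is Qwen2.5-based, it should load through the standard Hugging Face `transformers` causal-LM interface. Below is a minimal sketch: the repo id `tablegpt/TableGPT2-7B` and the embed-the-table-as-CSV prompt style are assumptions, so check the model card for the exact repo path and recommended prompt format.

```python
# Minimal sketch of querying TableGPT2-7B about a small table.
# Assumptions: repo id "tablegpt/TableGPT2-7B" and a plain CSV-in-prompt format.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "tablegpt/TableGPT2-7B"  # assumed repo id; verify on the hub
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(
    model_name, torch_dtype="auto", device_map="auto"
)

# Tabular input is passed in-text; here a small CSV snippet is embedded in the prompt.
table = "region,revenue\nNorth,1200\nSouth,950"
messages = [
    {
        "role": "user",
        "content": f"Given this table:\n{table}\nWhich region has higher revenue?",
    }
]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

# Generate and decode only the newly produced tokens.
output = model.generate(inputs, max_new_tokens=128)
print(tokenizer.decode(output[0][inputs.shape[-1]:], skip_special_tokens=True))
```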