TableLLM-13b: Tabular Data Manipulation
TableLLM-13b is a 13-billion-parameter large language model developed by RUCKBReasoning, fine-tuned from CodeLlama-13b-Instruct-hf and specifically engineered for robust tabular data manipulation. With a 4096-token context length, it addresses real-world office scenarios by handling tables embedded in both spreadsheets and documents.
Key Capabilities
- Code Generation: For spreadsheet-embedded tabular data, TableLLM-13b generates Python code solutions for operations such as insert, delete, update, query, merge, and plot.
- Text Generation: For document-embedded tabular data, it provides direct text answers, primarily for query operations on short tables.
- Strong Performance: On the authors' self-created table operation benchmark, TableLLM-13b scored 80.8%, outperforming models such as GPT-3.5 and the base CodeLlama-13B. It also posted strong code-generation results on WikiSQL (90.7%) and Spider (83.4%), and competitive text-answer results on WikiTQ (62.4%) and FeTaQA (74.5%).
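To make the spreadsheet-side capabilities concrete, below is a minimal, hand-written pandas sketch of the kinds of operations TableLLM-13b is trained to emit (insert, delete, update, query, merge). The tables, column names, and values are illustrative assumptions, not actual model output.

```python
import pandas as pd

# Illustrative spreadsheet-style tables (names and values are assumptions).
employees = pd.DataFrame({
    "id": [1, 2, 3],
    "name": ["Ann", "Bob", "Cy"],
    "dept": ["Sales", "IT", "Sales"],
    "salary": [50000.0, 60000.0, 55000.0],
})
departments = pd.DataFrame({
    "dept": ["Sales", "IT"],
    "location": ["NYC", "SF"],
})

# insert: append a new row
new_row = pd.DataFrame([{"id": 4, "name": "Dee", "dept": "IT", "salary": 58000.0}])
employees = pd.concat([employees, new_row], ignore_index=True)

# update: raise Sales salaries by 5%
employees.loc[employees["dept"] == "Sales", "salary"] *= 1.05

# delete: drop the row with id == 2
employees = employees[employees["id"] != 2].reset_index(drop=True)

# query: average salary per department
avg_salary = employees.groupby("dept")["salary"].mean()

# merge: join with the departments table on the shared "dept" column
merged = employees.merge(departments, on="dept", how="left")
```

A generated plot operation would typically follow the same pattern with a matplotlib call (e.g. a bar chart of `avg_salary`); it is omitted here to keep the sketch dependency-light.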
Good for
- Automating complex data operations within spreadsheets.
- Extracting and querying information from tables embedded in documents.
- Developers and data analysts needing to programmatically interact with tabular data using natural language prompts.
- Scenarios requiring both code-based and direct-answer approaches to table tasks.