Mathoctopus/Parallel_13B
Text Generation · Concurrency Cost: 1 · Model Size: 13B · Quant: FP8 · Ctx Length: 4K · License: apache-2.0 · Architecture: Transformer

Mathoctopus/Parallel_13B is a 13-billion-parameter large language model based on LLaMA 2, released by Mathoctopus and fine-tuned specifically for multilingual mathematical reasoning. Trained on the MGSM8KInstruct dataset spanning ten languages, the model solves math word problems across diverse linguistic contexts. It outperforms comparable open-source LLMs, as well as ChatGPT, in few-shot multilingual math evaluations, making it well suited to educational software and tutoring systems that require robust mathematical problem-solving.
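
A minimal usage sketch with the Hugging Face transformers library is shown below. The instruction-style prompt template and the generation settings are assumptions for illustration only; check the model card for the exact prompt format used during fine-tuning.

```python
# Minimal sketch: loading Mathoctopus/Parallel_13B with Hugging Face transformers.
# The Alpaca-style prompt below is an assumption; the model card documents the
# actual instruction format expected by the fine-tuned weights.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Mathoctopus/Parallel_13B"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # half precision; adjust dtype/device_map for your hardware
    device_map="auto",
)

# Example multilingual grade-school math question (German).
prompt = (
    "Below is an instruction that describes a task. "
    "Write a response that appropriately completes the request.\n\n"
    "### Instruction:\nLisa hat 3 Äpfel und kauft 5 weitere. "
    "Wie viele Äpfel hat sie insgesamt?\n\n### Response:"
)

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256, do_sample=False)

# Print only the newly generated tokens (the model's answer).
answer = tokenizer.decode(
    outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True
)
print(answer)
```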
