Mathoctopus/Parallel_7B
Task: Text generation
Model size: 7B
Quantization: FP8
Context length: 4k
License: apache-2.0
Architecture: Transformer
Concurrency cost: 1

Mathoctopus/Parallel_7B is a 7-billion-parameter large language model based on LLaMA 2, developed by MathOctopus and fine-tuned specifically for multilingual mathematical reasoning. It is trained on the MGSM8KInstruct dataset, which covers ten distinct languages, and solves math problems across all of them. The model notably outperforms conventional open-source LLMs and ChatGPT in few-shot multilingual math scenarios, making it a good fit for educational software and tutoring systems that require robust mathematical problem-solving.
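A minimal sketch of how one might query the model locally. The Alpaca-style prompt template below is an assumption (MathOctopus fine-tunes on instruction data, but verify the exact template in the project repository), and the commented generation settings are illustrative only:

```python
def build_prompt(question: str) -> str:
    """Build an Alpaca-style instruction prompt.

    NOTE: this template is an assumption; check the MathOctopus
    repository for the exact format used during fine-tuning.
    """
    return (
        "Below is an instruction that describes a task. "
        "Write a response that appropriately completes the request.\n\n"
        f"### Instruction:\n{question}\n\n### Response:"
    )

prompt = build_prompt("Janet has 3 apples and buys 4 more. How many does she have?")
print(prompt)

# Generation via the Hugging Face transformers library (loading the 7B
# weights requires substantial GPU or CPU memory):
# from transformers import AutoModelForCausalLM, AutoTokenizer
# tok = AutoTokenizer.from_pretrained("Mathoctopus/Parallel_7B")
# model = AutoModelForCausalLM.from_pretrained("Mathoctopus/Parallel_7B")
# out = model.generate(**tok(prompt, return_tensors="pt"), max_new_tokens=256)
# print(tok.decode(out[0], skip_special_tokens=True))
```

For few-shot use, the same template can simply be concatenated with worked example question/answer pairs before the final question.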
