Mathoctopus/Cross_13B
Text generation · Concurrency cost: 1 · Model size: 13B · Quantization: FP8 · Context length: 4k · License: apache-2.0 · Architecture: Transformer · Open weights

Mathoctopus/Cross_13B is a 13 billion parameter LLaMA 2-based large language model developed by Mathoctopus and fine-tuned for multilingual mathematical reasoning. It is trained on the MGSM8KInstruct dataset, which covers ten languages, and outperforms conventional open-source LLMs as well as ChatGPT in few-shot math problem-solving. The model is intended for research use, targeting applications that require solving math problems across multiple languages.
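
Below is a minimal sketch of loading the model and asking it a multilingual math question with Hugging Face transformers. The repository id "Mathoctopus/Cross_13B" and the Alpaca-style instruction template are assumptions; check the model card for the exact prompt format used during fine-tuning.

```python
# Assumptions: the weights are hosted on the Hugging Face Hub under
# "Mathoctopus/Cross_13B" and the model follows an Alpaca-style
# instruction/response prompt format. Adjust both if the card says otherwise.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Mathoctopus/Cross_13B"  # assumed repo id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

# Example multilingual math question (German): "A farmer has 3 baskets with
# 12 apples each. How many apples does he have in total?"
prompt = (
    "Below is an instruction that describes a task. "
    "Write a response that appropriately completes the request.\n\n"
    "### Instruction:\nEin Bauer hat 3 Körbe mit je 12 Äpfeln. "
    "Wie viele Äpfel hat er insgesamt?\n\n### Response:\n"
)

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256, do_sample=False)

# Print only the newly generated tokens, not the echoed prompt.
new_tokens = outputs[0][inputs["input_ids"].shape[1]:]
print(tokenizer.decode(new_tokens, skip_special_tokens=True))
```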
