Mathoctopus/Parallel_xRFT_7B
Text generation · Concurrency cost: 1 · Model size: 7B · Quantization: FP8 · Context length: 4k · License: apache-2.0 · Architecture: Transformer · Open weights

Mathoctopus/Parallel_xRFT_7B is a 7-billion-parameter large language model based on LLaMA 2, developed by Mathoctopus and fine-tuned for multilingual mathematical reasoning. The model solves math word problems across ten languages, combining a parallel-training strategy with multilingual rejection-sampling fine-tuning (xRFT). It is intended for applications that need robust mathematical problem-solving in diverse linguistic contexts.
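Because the model ships as standard Transformer weights, it can presumably be loaded with the Hugging Face `transformers` library. The sketch below assumes the checkpoint id `Mathoctopus/Parallel_xRFT_7B` and an Alpaca-style instruction prompt; the exact template used during fine-tuning may differ, so consult the upstream model card before relying on it.

```python
# Minimal sketch, assuming the checkpoint is hosted on the Hugging Face Hub
# under "Mathoctopus/Parallel_xRFT_7B" and accepts an Alpaca-style prompt.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Mathoctopus/Parallel_xRFT_7B"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # half precision; the hosted variant is FP8-quantized
    device_map="auto",
)

# Example: a grade-school math problem posed in Spanish, one of the
# ten languages the model was fine-tuned on.
prompt = (
    "Below is an instruction that describes a task. "
    "Write a response that appropriately completes the request.\n\n"
    "### Instruction:\n"
    "María tiene 3 cajas con 12 manzanas cada una. "
    "¿Cuántas manzanas tiene en total?\n\n"
    "### Response:"
)
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256, do_sample=False)

# Strip the prompt tokens and print only the generated answer.
answer = tokenizer.decode(
    outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True
)
print(answer)
```

Keep prompt plus generation within the 4k-token context window; greedy decoding (`do_sample=False`) is a reasonable default for math problems, where a single deterministic chain of reasoning is usually preferable to sampled variety.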
