laihuiyuan/mCoT
Task: Text generation
Model size: 7B
Quantization: FP8
Context length: 4K
License: apache-2.0
Architecture: Transformer

laihuiyuan/mCoT is a 7 billion parameter multilingual instruction-tuned language model based on Mistral-7B-v0.1, published by Laihuiyuan. It is optimized for multilingual mathematical reasoning, trained on mCoT-MATH, a dataset of 6.3 million samples spanning 11 languages. The model demonstrates strong reasoning consistency across languages, making it well suited to multilingual math problem solving.
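A minimal usage sketch with Hugging Face `transformers` is shown below. The repository id `laihuiyuan/mCoT`, the prompt template, and the generation settings are assumptions for illustration, not confirmed by this card; adjust them to the model's actual prompt format.

```python
def build_prompt(question: str) -> str:
    """Wrap a math question in a simple step-by-step (chain-of-thought) instruction.

    NOTE: this template is a guess for illustration; check the model's
    documentation for the prompt format it was actually trained with.
    """
    return (
        "Please answer the following math question step by step.\n"
        f"Question: {question}\n"
        "Answer:"
    )


def generate_answer(question: str, model_id: str = "laihuiyuan/mCoT") -> str:
    """Load the model and generate a reasoned answer (requires `pip install transformers`)."""
    # Imported lazily: loading a 7B checkpoint needs substantial memory,
    # ideally a GPU with FP8/FP16 support.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")
    inputs = tokenizer(build_prompt(question), return_tensors="pt").to(model.device)
    outputs = model.generate(**inputs, max_new_tokens=512)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)


if __name__ == "__main__":
    print(generate_answer(
        "If a train travels 60 km in 45 minutes, what is its average speed in km/h?"
    ))
```

Because the model is tuned for reasoning consistency across languages, the same template can in principle be used with questions written in any of the languages covered by mCoT-MATH.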
