yuntian-deng/implicit-cot-math-mistral7b
Text generation · Concurrency cost: 1 · Model size: 7B · Quantization: FP8 · Context length: 4k · Published: Jul 10, 2024 · Architecture: Transformer
yuntian-deng/implicit-cot-math-mistral7b is a 7-billion-parameter language model based on the Mistral architecture, fine-tuned specifically for mathematical reasoning. It uses an implicit Chain-of-Thought (CoT) approach: rather than generating reasoning steps as explicit text, the model is trained to internalize them, which can speed up inference while preserving multi-step problem-solving ability. With a context length of 4096 tokens, it is aimed at applications that need robust mathematical capabilities.
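To make the "implicit CoT" idea concrete, here is a minimal sketch of how one might prompt such a model for a direct answer. The prompt template and the `build_prompt` helper are illustrative assumptions, not taken from the model card; check the repository for the exact format the model was trained on.

```python
# Hypothetical sketch: prompting an implicit-CoT math model.
# Unlike explicit chain-of-thought prompting, no "think step by step"
# instruction is added -- the reasoning is expected to be internalized
# in the model's weights, and the model answers directly.

def build_prompt(question: str) -> str:
    """Format a math question for direct-answer generation (assumed template)."""
    return f"Question: {question}\nAnswer:"

prompt = build_prompt("What is 12 * 34?")

# To actually run the model, one would typically load it with the
# Hugging Face transformers library (usage is an assumption; consult
# the repository for the confirmed loading code), e.g.:
#
#   from transformers import AutoModelForCausalLM, AutoTokenizer
#   name = "yuntian-deng/implicit-cot-math-mistral7b"
#   tokenizer = AutoTokenizer.from_pretrained(name)
#   model = AutoModelForCausalLM.from_pretrained(name)
#   inputs = tokenizer(prompt, return_tensors="pt")
#   outputs = model.generate(**inputs, max_new_tokens=16)
#   print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Because the reasoning is implicit, the generated continuation should contain only the final answer, keeping output short relative to explicit-CoT models.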