fzzhang/Marcoroni-neural-chat-7B-v2_gsm8k_merged
Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quantization: FP8 · Context Length: 4k · Published: Feb 16, 2024 · License: apache-2.0 · Architecture: Transformer · Open Weights
fzzhang/Marcoroni-neural-chat-7B-v2_gsm8k_merged is a 7-billion-parameter language model, fine-tuned from Toten5/Marcoroni-neural-chat-7B-v2 and optimized for mathematical reasoning. It supports a 4096-token context window and is specialized for problems in the style of the GSM8K dataset. Its main differentiator is stronger arithmetic and commonsense reasoning, making it suitable for applications that require robust numerical problem solving.
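As a rough sketch of how such a model might be queried for GSM8K-style word problems, the snippet below builds a simple instruction prompt and shows (commented out) how it could be passed to the model via the Hugging Face `transformers` pipeline. The prompt wording and the use of a generic text-generation pipeline are assumptions, not a documented interface for this model; check the base model's chat template before relying on this format.

```python
# Minimal sketch of prompting this model for a GSM8K-style word problem.
# The prompt format below is an assumption; the model card does not
# document an official chat template.

def build_gsm8k_prompt(question: str) -> str:
    """Wrap a grade-school math word problem in a simple instruction prompt."""
    return (
        "Below is a math word problem. Solve it step by step, "
        "then state the final numeric answer.\n\n"
        f"Question: {question}\nAnswer:"
    )

prompt = build_gsm8k_prompt(
    "A baker sells 12 loaves a day at $3 each. How much does she earn in 5 days?"
)
print(prompt)

# To run the model itself (requires a GPU and the transformers library):
# from transformers import pipeline
# pipe = pipeline(
#     "text-generation",
#     model="fzzhang/Marcoroni-neural-chat-7B-v2_gsm8k_merged",
# )
# print(pipe(prompt, max_new_tokens=256)[0]["generated_text"])
```

Keeping generation under the 4096-token context window means budgeting both the prompt and the step-by-step solution, so `max_new_tokens` should be set with the prompt length in mind.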