fzzhang/Marcoroni-neural-chat-7B-v2_gsm8k_merged_s
Text generation · Concurrency cost: 1 · Model size: 7B · Quant: FP8 · Context length: 4k · Published: Feb 16, 2024 · License: apache-2.0 · Architecture: Transformer · Open weights

fzzhang/Marcoroni-neural-chat-7B-v2_gsm8k_merged_s is a 7-billion-parameter language model with a 4096-token context length. The name indicates a merged model, likely combining Marcoroni and neural-chat 7B variants with a GSM8K-related fine-tuning stage, though the card does not document the merge recipe. Its specific differentiators and primary use cases are not detailed, so it is best treated as a general-purpose conversational or instruction-tuned model.
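The card's figures (7B parameters, FP8 weights, 4k context) allow a rough serving-memory estimate. The sketch below uses only those numbers plus an assumed Mistral-7B-style configuration (32 layers, 8 KV heads, head dimension 128) and an assumed FP16 KV cache; none of those internals are stated on the card.

```python
# Back-of-envelope memory estimate from the card's figures:
# 7B parameters, FP8 (1 byte/param) weights, 4096-token context.
# Layer/head counts are ASSUMED (Mistral-7B-style config), not from the card.

def weight_bytes(n_params: int, bytes_per_param: float) -> float:
    """Memory for model weights alone."""
    return n_params * bytes_per_param

def kv_cache_bytes(ctx_len: int, n_layers: int, n_kv_heads: int,
                   head_dim: int, bytes_per_elem: float) -> float:
    """Per-sequence KV cache: 2 tensors (K and V) per layer."""
    return 2 * ctx_len * n_layers * n_kv_heads * head_dim * bytes_per_elem

GIB = 1024 ** 3

weights = weight_bytes(7_000_000_000, 1.0)   # FP8 = 1 byte per parameter
kv = kv_cache_bytes(4096, 32, 8, 128, 2.0)   # FP16 KV cache, assumed config

print(f"weights  ≈ {weights / GIB:.1f} GiB")            # ~6.5 GiB
print(f"KV cache ≈ {kv / GIB:.2f} GiB per sequence")    # 0.50 GiB
```

Under these assumptions the model fits comfortably on a single 16 GB GPU with room for several concurrent 4k-token sequences.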
