shadowml/Marcoro14-7B-slerp
Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 4k · Published: Dec 30, 2023 · License: apache-2.0 · Architecture: Transformer · Open Weights · Cold
shadowml/Marcoro14-7B-slerp is a 7-billion-parameter language model created by shadowml, built as a slerp (spherical linear interpolation) merge of AIDC-ai-business/Marcoroni-7B-v3 and EmbeddedLLM/Mistral-7B-Merge-14-v0.1. The merge is intended to combine the strengths of its constituent models, offering general-purpose language understanding and generation within a 4096-token context window and a balanced performance profile across text-based tasks.
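Because the model is distributed with open weights, it can be loaded like any causal language model from the Hugging Face Hub. The snippet below is a minimal sketch of such usage with the transformers library; the prompt and generation parameters are illustrative assumptions, not settings recommended by the model authors.

```python
# Minimal sketch: text generation with shadowml/Marcoro14-7B-slerp via transformers.
# Generation settings below are illustrative, not author-recommended values.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "shadowml/Marcoro14-7B-slerp"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

prompt = "Explain what a slerp model merge is in one paragraph."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

# The model exposes a 4096-token context window; keep prompt plus output within it.
outputs = model.generate(**inputs, max_new_tokens=256, do_sample=True, temperature=0.7)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```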