occultml/CatMarcoro14-7B-slerp
Text generation
Concurrency cost: 1 · Model size: 7B · Quant: FP8 · Context length: 4k
Published: Jan 6, 2024 · License: apache-2.0 · Architecture: Transformer · Open weights
CatMarcoro14-7B-slerp is a 7 billion parameter language model published by occultml, produced by a slerp (spherical linear interpolation) merge of cookinai/CatMacaroni-Slerp and EmbeddedLLM/Mistral-7B-Merge-14-v0.2. Built on the Mistral architecture and intended for general language tasks, it achieves an average score of 73.25 on the Open LLM Leaderboard across its benchmark suite. With a 4096-token context length, it offers balanced capabilities for reasoning, common sense, and language understanding.
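To illustrate the slerp technique behind the merge (not the actual merge pipeline used for this model), here is a minimal sketch of spherical linear interpolation between two toy parameter vectors; the function name and the fallback-to-lerp threshold are our own choices:

```python
import math

def slerp(t, v0, v1, eps=1e-8):
    """Spherically interpolate between vectors v0 and v1 at fraction t.

    Unlike plain linear interpolation, slerp follows the arc between the
    two vectors, which tends to better preserve weight magnitudes when
    merging model parameters.
    """
    norm0 = math.sqrt(sum(x * x for x in v0))
    norm1 = math.sqrt(sum(x * x for x in v1))
    # Angle between the two vectors, clamped for numerical safety
    dot = sum(a * b for a, b in zip(v0, v1)) / (norm0 * norm1)
    dot = max(-1.0, min(1.0, dot))
    omega = math.acos(dot)
    if abs(math.sin(omega)) < eps:
        # Nearly parallel vectors: fall back to linear interpolation
        return [(1 - t) * a + t * b for a, b in zip(v0, v1)]
    s0 = math.sin((1 - t) * omega) / math.sin(omega)
    s1 = math.sin(t * omega) / math.sin(omega)
    return [s0 * a + s1 * b for a, b in zip(v0, v1)]

# Halfway between two orthogonal unit vectors lands on the unit circle
merged = slerp(0.5, [1.0, 0.0], [0.0, 1.0])
print(merged)
```

In practice, merges like this one are typically run per-tensor over full model checkpoints (e.g. with a tool such as mergekit), with `t` controlling how much each parent model contributes.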