NeverSleep/Mistral-11B-OmniMix-bf16
Text generation · Model size: 10.7B · Quant: FP8 · Context length: 4k · Published: Oct 12, 2023 · License: cc-by-nc-4.0 · Architecture: Transformer · Open weights

NeverSleep/Mistral-11B-OmniMix-bf16 is a 10.7-billion-parameter language model created by NeverSleep, built by merging four Mistral-7B variants: Mistral-7B-OpenOrca, Mistral-7B-v0.1-Open-Platypus, CollectiveCognition-v1.1-Mistral-7B, and Zephyr-7b-alpha. The model explores how effective merge and layer-manipulation techniques are at achieving high benchmark scores. It is intended as a test model to probe how objective benchmarks really are, and to encourage users to evaluate models directly rather than relying solely on scores.
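The card does not document the exact merge recipe, and the 10.7B size suggests layer stacking in addition to blending. As a minimal, hypothetical sketch of the blending side, the snippet below shows weighted parameter averaging, a common model-merging technique, using plain Python dicts in place of real model state dicts (all names here are illustrative, not from the model):

```python
# Hypothetical sketch: weighted parameter averaging across models.
# Plain dicts of floats stand in for checkpoint state dicts.

def merge_state_dicts(state_dicts, weights=None):
    """Average matching parameters across several models.

    Defaults to an equal-weight average when no weights are given.
    """
    if weights is None:
        weights = [1.0 / len(state_dicts)] * len(state_dicts)
    merged = {}
    for name in state_dicts[0]:
        merged[name] = sum(w * sd[name] for w, sd in zip(weights, state_dicts))
    return merged

# Two toy "models" with a single scalar parameter each.
a = {"layer.weight": 1.0}
b = {"layer.weight": 3.0}
print(merge_state_dicts([a, b]))          # equal-weight average
print(merge_state_dicts([a, b], [0.25, 0.75]))  # biased toward model b
```

In practice a tool such as mergekit performs this over full tensors and can also interleave or duplicate whole layers, which is how four 7B models can yield a 10.7B result.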
