NeverSleep/Mistral-11B-AirOmniMix
Text generation · Model size: 10.7B · Quant: FP8 · Context length: 4k · Published: Oct 14, 2023 · License: cc-by-nc-4.0 · Architecture: Transformer

NeverSleep/Mistral-11B-AirOmniMix is a 10.7 billion parameter language model created by NeverSleep, built by merging four Mistral-7B fine-tunes: Open-Orca/Mistral-7B-OpenOrca, akjindal53244/Mistral-7B-v0.1-Open-Platypus, teknium/CollectiveCognition-v1.1-Mistral-7B, and teknium/airoboros-mistral2.2-7b. The model uses a SLERP (spherical linear interpolation) merge to combine the strengths of its constituent models, and offers a 4096-token context length. It is designed for general-purpose conversational and instruction-following tasks, with competitive results on benchmarks such as ARC Challenge, HellaSwag, and TruthfulQA.
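To illustrate the idea behind a SLERP merge, the sketch below interpolates between two weight tensors along the arc of a hypersphere rather than along a straight line, which tends to preserve the magnitude structure of the weights better than plain averaging. This is a minimal illustration in NumPy, not the actual merge recipe used for this model; the function name and the toy vectors are assumptions for demonstration.

```python
import numpy as np

def slerp(t: float, v0: np.ndarray, v1: np.ndarray, eps: float = 1e-8) -> np.ndarray:
    """Spherical linear interpolation between two flattened weight tensors.

    t=0 returns v0, t=1 returns v1; intermediate t follows the great-circle
    arc between the (normalized) directions of the two vectors.
    """
    # Normalize copies only to measure the angle between the two vectors.
    v0_n = v0 / (np.linalg.norm(v0) + eps)
    v1_n = v1 / (np.linalg.norm(v1) + eps)
    dot = np.clip(np.dot(v0_n, v1_n), -1.0, 1.0)
    omega = np.arccos(dot)  # angle between the two weight directions
    so = np.sin(omega)
    # Fall back to plain linear interpolation when the vectors are
    # nearly parallel and the spherical formula becomes ill-conditioned.
    if abs(so) < eps:
        return (1.0 - t) * v0 + t * v1
    return (np.sin((1.0 - t) * omega) / so) * v0 + (np.sin(t * omega) / so) * v1

# Toy example: interpolate halfway between two orthogonal "weight" vectors.
a = np.array([1.0, 0.0])
b = np.array([0.0, 1.0])
merged = slerp(0.5, a, b)  # lies on the unit arc between a and b
```

In a real merge, this interpolation would be applied tensor-by-tensor across the checkpoints, often with different `t` values for different layer groups.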
