NeverSleep/Mistral-11B-SynthIAirOmniMix
Text Generation · Concurrency Cost: 1 · Model Size: 10.7B · Quant: FP8 · Ctx Length: 4k · Published: Oct 14, 2023 · License: cc-by-nc-4.0 · Architecture: Transformer

NeverSleep/Mistral-11B-SynthIAirOmniMix is a 10.7-billion-parameter merged language model based on the Mistral architecture, created by NeverSleep. It blends several Mistral-7B variants (SynthIA-7B-v1.5, Mistral-7B-v0.1-Open-Platypus, CollectiveCognition-v1.1-Mistral-7B, and airoboros-mistral2.2-7b) using the slerp merge method. The merge explores whether combining models that share consistent prompt formats yields better general performance across tasks, within the model's 4096-token context window.
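The slerp (spherical linear interpolation) merge mentioned above combines corresponding weight tensors from two models along the arc between them rather than along a straight line, which preserves parameter norms better than plain averaging. The sketch below is an illustrative, simplified version of per-tensor slerp, not the exact procedure used to build this model; the function name and the fallback-to-lerp threshold are assumptions for demonstration.

```python
import numpy as np

def slerp(t: float, v0: np.ndarray, v1: np.ndarray, eps: float = 1e-8) -> np.ndarray:
    """Spherical linear interpolation between two weight tensors.

    t=0 returns v0, t=1 returns v1; intermediate t values move along
    the great-circle arc between the two (flattened, normalized) tensors.
    Illustrative sketch only, not the exact merge recipe of this model.
    """
    a = v0.ravel().astype(np.float64)
    b = v1.ravel().astype(np.float64)
    # Angle between the two tensors, via the dot product of unit vectors.
    dot = np.clip(np.dot(a / np.linalg.norm(a), b / np.linalg.norm(b)), -1.0, 1.0)
    omega = np.arccos(dot)
    if omega < eps:
        # Nearly colinear tensors: fall back to ordinary linear interpolation.
        return (1.0 - t) * v0 + t * v1
    so = np.sin(omega)
    out = (np.sin((1.0 - t) * omega) / so) * a + (np.sin(t * omega) / so) * b
    return out.reshape(v0.shape)
```

A merge tool would apply this tensor-by-tensor across two checkpoints (optionally with different `t` per layer), producing the blended weights of the final model.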
