macadeliccc/OmniCorso-7B
Text generation · Concurrency cost: 1 · Model size: 7B · Quantization: FP8 · Context length: 4K · Published: Feb 11, 2024 · License: cc · Architecture: Transformer

OmniCorso-7B by macadeliccc is a 7-billion-parameter language model created by merging macadeliccc/MBX-7B-v3-DPO and mlabonne/OmniBeagle-7B with the SLERP (spherical linear interpolation) merge method. The model performs strongly across benchmarks, scoring an average of 75.74 on the Open LLM Leaderboard and 61.73 on a custom evaluation suite. It is intended for general-purpose conversational AI and reasoning tasks, balancing model size and capability.
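SLERP merging interpolates between two models' weights along the surface of a hypersphere rather than along a straight line, which better preserves each parent's weight geometry. Below is a minimal sketch of the per-tensor interpolation underlying such a merge; it is illustrative only (the actual merge was presumably done with a merge toolkit, and tensor names, shapes, and the interpolation factor here are assumptions):

```python
import numpy as np

def slerp(t, v0, v1, eps=1e-8):
    """Spherical linear interpolation between two flattened weight tensors.

    t=0 returns v0, t=1 returns v1; intermediate t follows the arc
    between the two (normalized) directions rather than the chord.
    """
    v0_unit = v0 / (np.linalg.norm(v0) + eps)
    v1_unit = v1 / (np.linalg.norm(v1) + eps)
    dot = np.clip(np.dot(v0_unit, v1_unit), -1.0, 1.0)
    omega = np.arccos(dot)  # angle between the two weight directions
    if np.sin(omega) < eps:
        # Nearly colinear weights: fall back to plain linear interpolation
        return (1 - t) * v0 + t * v1
    s0 = np.sin((1 - t) * omega) / np.sin(omega)
    s1 = np.sin(t * omega) / np.sin(omega)
    return s0 * v0 + s1 * v1

# Toy example with two orthogonal "weight vectors"
merged = slerp(0.5, np.array([1.0, 0.0]), np.array([0.0, 1.0]))
```

In a real merge this interpolation is applied tensor-by-tensor across both checkpoints, often with a different `t` per layer group.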
