Stopwolf/Cerberus-7B-slerp
Text generation · Concurrency cost: 1 · Model size: 7B · Quantization: FP8 · Context length: 4k · Published: Jan 25, 2024 · License: apache-2.0 · Architecture: Transformer · Open weights · Cold

Cerberus-7B-slerp is a 7 billion parameter language model created by Stopwolf, formed by spherically interpolating (slerp) fblgit/UNA-TheBeagle-7b-v1 and UCLA-AGI/zephyr-7b-sft-full-SPIN-iter3. The merged model demonstrates strong general reasoning, achieving an average score of 63.46 on the Open LLM Leaderboard. It is suitable for tasks requiring robust understanding and generation, performing particularly well on the HellaSwag and Winogrande benchmarks.
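For intuition, the sketch below shows what spherical linear interpolation between two weight tensors looks like at the tensor level. It is a minimal illustration only: the actual merge recipe (interpolation factor, per-layer weighting, tooling such as mergekit) is not specified in this card, and the function name and parameters here are assumptions for demonstration.

```python
import torch

def slerp(t: float, a: torch.Tensor, b: torch.Tensor, eps: float = 1e-8) -> torch.Tensor:
    """Spherical linear interpolation between two weight tensors.

    t=0 returns `a`, t=1 returns `b`; intermediate values follow the
    great-circle path between the two flattened weight vectors.
    """
    a_flat, b_flat = a.flatten().float(), b.flatten().float()
    a_dir = a_flat / (a_flat.norm() + eps)
    b_dir = b_flat / (b_flat.norm() + eps)
    dot = torch.clamp(a_dir @ b_dir, -1.0, 1.0)
    omega = torch.acos(dot)                  # angle between the two weight vectors
    if omega.abs() < eps:                    # nearly parallel: fall back to linear interpolation
        return (1 - t) * a + t * b
    sin_omega = torch.sin(omega)
    w_a = torch.sin((1 - t) * omega) / sin_omega
    w_b = torch.sin(t * omega) / sin_omega
    return (w_a * a_flat + w_b * b_flat).reshape(a.shape).to(a.dtype)

# Illustrative usage: merge two state dicts parameter by parameter
# (hypothetical paths and a uniform t=0.5; not the published merge configuration).
# merged = {k: slerp(0.5, sd_a[k], sd_b[k]) for k in sd_a}
```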
