Stopwolf/DistilabelCerberus-7B-slerp
Text generation · Concurrency cost: 1 · Model size: 7B · Quant: FP8 · Context length: 4k · Published: Feb 1, 2024 · License: apache-2.0 · Architecture: Transformer · Open weights
DistilabelCerberus-7B-slerp is a 7-billion-parameter language model created by Stopwolf by merging dvilasuero/DistilabelBeagle14-7B and teknium/OpenHermes-2.5-Mistral-7B with the slerp (spherical linear interpolation) merge method. The merged model outperforms its base models on reasoning and commonsense benchmarks, including ARC-C, HellaSwag, and GSM8K. It is suited to general language understanding and generation tasks, combining the strengths of its constituent models.
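The exact merge configuration is not given on this card, but a slerp merge interpolates each pair of corresponding weight tensors along the arc between them rather than along a straight line, which better preserves the geometry of the two parameter spaces. A minimal sketch of the underlying slerp operation (illustrative only; the interpolation factor `t` and the flattening scheme are assumptions, not details from this model):

```python
import numpy as np

def slerp(t, v0, v1, eps=1e-8):
    """Spherical linear interpolation between two weight tensors.

    t=0 returns v0, t=1 returns v1; intermediate t follows the arc
    between the (flattened) tensors instead of the chord.
    """
    v0f, v1f = v0.ravel(), v1.ravel()
    # Cosine of the angle between the flattened tensors.
    dot = np.dot(v0f, v1f) / (np.linalg.norm(v0f) * np.linalg.norm(v1f) + eps)
    omega = np.arccos(np.clip(dot, -1.0, 1.0))
    so = np.sin(omega)
    if abs(so) < eps:
        # Nearly parallel tensors: fall back to linear interpolation.
        return (1.0 - t) * v0 + t * v1
    return (np.sin((1.0 - t) * omega) / so) * v0 + (np.sin(t * omega) / so) * v1
```

In a full model merge this function would be applied layer by layer to the two checkpoints, often with a different `t` per layer group; community merges of this kind are typically produced with tooling such as mergekit, though this card does not say which tool was used.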