yleo/EmertonMonarch-7B-slerp
Text Generation · Concurrency cost: 1 · Model size: 7B · Quant: FP8 · Context length: 4k · Published: Feb 14, 2024 · License: cc-by-nc-4.0 · Architecture: Transformer · Open weights · Cold

EmertonMonarch-7B-slerp is a 7-billion-parameter language model created by yleo, produced by merging mlabonne/Monarch-7B and yleo/EmertonBeagle-7B-dpo with SLERP (spherical linear interpolation). The merge combines the strengths of its constituent models into a balanced performance profile, and the result is suited to general-purpose text generation within its 4096-token context window.
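To illustrate what a SLERP merge does, the sketch below interpolates two weight tensors along the arc between them rather than along a straight line, which preserves their magnitude better than plain averaging. This is a minimal NumPy illustration of the interpolation formula only, not yleo's actual merge configuration or the mergekit implementation:

```python
import numpy as np

def slerp(t, v0, v1, eps=1e-8):
    """Spherical linear interpolation between two weight tensors.

    t:  interpolation factor in [0, 1] (0 -> v0, 1 -> v1).
    Falls back to linear interpolation when the tensors are near-colinear,
    since the spherical formula is numerically unstable there.
    """
    v0_f = v0.flatten().astype(np.float64)
    v1_f = v1.flatten().astype(np.float64)
    # Cosine of the angle between the flattened tensors
    cos_omega = np.dot(v0_f, v1_f) / (
        np.linalg.norm(v0_f) * np.linalg.norm(v1_f) + eps
    )
    cos_omega = np.clip(cos_omega, -1.0, 1.0)
    omega = np.arccos(cos_omega)
    if np.abs(np.sin(omega)) < eps:
        # Near-colinear tensors: ordinary linear interpolation
        return (1 - t) * v0 + t * v1
    s0 = np.sin((1 - t) * omega) / np.sin(omega)
    s1 = np.sin(t * omega) / np.sin(omega)
    return s0 * v0 + s1 * v1

# Example: the midpoint of two orthogonal unit vectors stays on the unit sphere
v0 = np.array([1.0, 0.0])
v1 = np.array([0.0, 1.0])
mid = slerp(0.5, v0, v1)
```

In a real merge this interpolation is applied per weight tensor across the two source checkpoints, typically with a per-layer schedule for `t`.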
