mvpmaster/MistralDpoPearl-7b-slerp
Text generation · Concurrency cost: 1 · Model size: 7B · Quant: FP8 · Context length: 4k · Published: Mar 19, 2024 · License: apache-2.0 · Architecture: Transformer · Open weights · Cold
MistralDpoPearl-7b-slerp is a 7-billion-parameter language model created by mvpmaster, formed by merging louisbrulenaudet/Pearl-7B-slerp and NousResearch/Nous-Hermes-2-Mistral-7B-DPO using a slerp (spherical linear interpolation) merge. By combining the strengths of its base components, it offers a robust foundation for general-purpose text generation, aimed at developers who want a capable 7B model derived from established Mistral-based architectures.
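Slerp merges of this kind are typically produced with mergekit. The exact configuration for this model is not published here, but a representative mergekit slerp config for the two named parents might look like the following sketch (layer ranges, interpolation weights, and the choice of base model are illustrative assumptions, not the author's actual settings):

```yaml
# Hypothetical mergekit slerp config -- values are illustrative only
slices:
  - sources:
      - model: louisbrulenaudet/Pearl-7B-slerp
        layer_range: [0, 32]          # Mistral-7B has 32 transformer layers
      - model: NousResearch/Nous-Hermes-2-Mistral-7B-DPO
        layer_range: [0, 32]
merge_method: slerp
base_model: NousResearch/Nous-Hermes-2-Mistral-7B-DPO
parameters:
  t:
    - filter: self_attn
      value: [0, 0.5, 0.3, 0.7, 1]   # per-layer interpolation for attention
    - filter: mlp
      value: [1, 0.5, 0.7, 0.3, 0]   # per-layer interpolation for MLP blocks
    - value: 0.5                      # default blend for remaining tensors
dtype: bfloat16
```

Here `t` controls the interpolation point between the two parents (0 = first model, 1 = second); varying it per layer and per module type is a common pattern in published slerp merges of Mistral-7B derivatives.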