mvpmaster/kellemar-KrishnaHercules-0.1-7b-slerp
Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 4k · Published: Mar 17, 2024 · License: apache-2.0 · Architecture: Transformer · Open Weights

mvpmaster/kellemar-KrishnaHercules-0.1-7b-slerp is a 7 billion parameter merged language model created by mvpmaster. It combines Kukedlc/NeuralKrishna-7B-V2-DPO and Locutusque/ChatHercules-2.5-Mistral-7B-DPO using a DARE TIES merge, aiming to leverage the strengths of both constituent models as a versatile base for general language generation tasks. The model is built on a Mistral architecture and supports a 4096-token context length.
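
Like most Mistral-7B-based merges, the model should load with the standard Hugging Face transformers APIs. The snippet below is a minimal sketch, assuming the repository follows the usual Mistral causal-LM layout; the prompt and generation settings are illustrative only, not taken from the model card.

```python
# Minimal sketch: loading the merged model with Hugging Face transformers.
# Assumes the repo exposes a standard Mistral-style tokenizer and weights.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "mvpmaster/kellemar-KrishnaHercules-0.1-7b-slerp"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # half precision fits a 7B model on a single 24 GB GPU
    device_map="auto",
)

prompt = "Explain what a model merge is in one paragraph."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

# Generate up to 200 new tokens; sampling parameters are illustrative.
output = model.generate(**inputs, max_new_tokens=200, do_sample=True, temperature=0.7)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```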
