aloobun/Cypher-7B
Task: Text Generation
Model size: 7B
Quant: FP8
Context length: 4k
Published: Apr 4, 2024
License: cc
Architecture: Transformer
Concurrency cost: 1

Cypher-7B by aloobun is a 7-billion-parameter language model produced by merging NousResearch/Nous-Hermes-2-Mistral-7B-DPO and cognitivecomputations/samantha-1.1-westlake-7b-laser with the SLERP (spherical linear interpolation) method. The merge aims to combine the strengths of both parents: the DPO alignment of Nous-Hermes-2 and the conversational tuning of samantha-1.1-westlake-7b-laser. It is intended for general-purpose language tasks, offering a balance of instruction following and conversational ability.
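SLERP merges weights by interpolating along the great-circle arc between each pair of corresponding tensors, rather than averaging them linearly, which better preserves the geometry of the weight space. A minimal per-tensor sketch with NumPy (the function name, the interpolation factor `t`, and the linear-interpolation fallback for near-collinear tensors are illustrative assumptions, not the exact merge configuration used for this model):

```python
import numpy as np

def slerp(t, v0, v1, eps=1e-8):
    """Spherical linear interpolation between two weight tensors.

    t=0 returns v0, t=1 returns v1; intermediate t values move
    along the arc between the two (direction-normalized) tensors.
    """
    v0_f = v0.flatten().astype(np.float64)
    v1_f = v1.flatten().astype(np.float64)
    # Cosine of the angle between the two normalized vectors
    dot = np.dot(v0_f / np.linalg.norm(v0_f),
                 v1_f / np.linalg.norm(v1_f))
    dot = np.clip(dot, -1.0, 1.0)
    omega = np.arccos(dot)
    if np.abs(np.sin(omega)) < eps:
        # Nearly collinear: fall back to plain linear interpolation
        return (1.0 - t) * v0 + t * v1
    # Standard SLERP coefficients
    s0 = np.sin((1.0 - t) * omega) / np.sin(omega)
    s1 = np.sin(t * omega) / np.sin(omega)
    return (s0 * v0_f + s1 * v1_f).reshape(v0.shape)
```

In a real merge this operation is applied tensor-by-tensor across both checkpoints, often with a different `t` per layer group; tools such as mergekit automate that bookkeeping.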
