CorticalStack/pastiche-crown-clown-7b-dare
Text generation · Model size: 7B · Quant: FP8 · Context length: 4K · Published: Feb 19, 2024 · License: apache-2.0 · Architecture: Transformer · Open weights

CorticalStack/pastiche-crown-clown-7b-dare is a 7-billion-parameter language model by CorticalStack, produced with a DARE-TIES merge of four 7B models: bardsai/jaskier-7b-dpo-v5.6, mlabonne/AlphaMonarch-7B, mlabonne/NeuralMonarch-7B, and macadeliccc/MBX-7B-v3-DPO. DARE (Drop And REscale) merging sparsifies each constituent model's fine-tuning deltas and rescales the surviving entries before combining them, which lets the merged model absorb abilities from its constituents with little interference. The result aims to combine the strengths of its base models into a versatile general-purpose model for language generation and understanding.
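The drop-and-rescale step at the heart of DARE can be sketched in a few lines. This is an illustrative NumPy sketch, not CorticalStack's actual merge code: plain arrays stand in for weight tensors, and the function names, drop rate, and equal-weight averaging are assumptions.

```python
import numpy as np

def dare_delta(finetuned, base, drop_rate, rng):
    """DARE: Drop a random fraction of the fine-tuning delta And REscale
    the survivors by 1 / (1 - drop_rate), preserving the delta's
    expected value. Illustrative sketch only."""
    delta = finetuned - base                       # task vector
    keep = rng.random(delta.shape) >= drop_rate    # random keep mask
    return delta * keep / (1.0 - drop_rate)        # rescaled sparse delta

def dare_merge(base, finetuned_models, drop_rate, seed=0):
    """Average the sparsified deltas from several models onto the base
    (equal weights assumed for illustration)."""
    rng = np.random.default_rng(seed)
    deltas = [dare_delta(m, base, drop_rate, rng) for m in finetuned_models]
    return base + np.mean(deltas, axis=0)
```

In practice a merge tool applies this per weight tensor, and the TIES variant adds a sign-election step so that conflicting deltas from different models do not cancel each other out.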