CorticalStack/pikus-pikantny-7B-dare
Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 4K · Published: Feb 29, 2024 · License: apache-2.0 · Architecture: Transformer
CorticalStack/pikus-pikantny-7B-dare is a 7-billion-parameter language model created by CorticalStack. It is a DARE merge of several existing models, including bardsai/jaskier-7b-dpo-v5.6 and mlabonne/NeuralDaredevil-7B. DARE (Drop And REscale), introduced in the paper "Language Models are Super Mario: Absorbing Abilities from Homologous Models as a Free Lunch", merges fine-tuned models by randomly dropping most of each model's delta parameters (its difference from a shared base model) and rescaling the surviving deltas before adding them back to the base. The aim is to combine the capabilities of the constituent models into a single model for general text generation.
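The drop-and-rescale step behind such a merge can be sketched in a few lines of NumPy. This is a simplified illustration, not CorticalStack's actual merge recipe (which would typically be produced with a tool like mergekit); the function name, drop probability, and uniform averaging of deltas are assumptions for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

def dare_merge(base, finetuned_weights, drop_p=0.9):
    """Sketch of a DARE merge for a single weight tensor.

    For each fine-tuned model: take its delta (fine-tuned minus base),
    randomly drop a fraction drop_p of the delta's entries, rescale the
    survivors by 1 / (1 - drop_p), then average the resulting deltas
    and add them back onto the base weights.
    """
    merged_delta = np.zeros_like(base)
    for w in finetuned_weights:
        delta = w - base
        # Keep roughly (1 - drop_p) of the entries, zeroing the rest.
        mask = rng.random(delta.shape) >= drop_p
        merged_delta += (delta * mask) / (1.0 - drop_p)
    return base + merged_delta / len(finetuned_weights)
```

With `drop_p=0.0` this reduces to a plain average of the fine-tuned weights; the rescaling keeps the expected magnitude of each delta unchanged as `drop_p` grows.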