CorticalStack/neurotic-crown-clown-7b-ties
Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 4K · Published: Feb 19, 2024 · License: apache-2.0 · Architecture: Transformer · Open Weights

CorticalStack's neurotic-crown-clown-7b-ties is a 7-billion-parameter language model created with the TrIm, Elect Sign & Merge (TIES) method. The merge combines mlabonne/NeuralMonarch-7B, mlabonne/AlphaMonarch-7B, and bardsai/jaskier-7b-dpo-v5.6, resolving parameter conflicts between them by trimming small weight deltas and electing a consensus sign before merging. It is designed for general language tasks, offering a balanced performance profile derived from its constituent models.
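A TIES merge like this one is typically produced with the mergekit tool. The card does not publish the actual merge configuration, so the sketch below is a hypothetical mergekit YAML for illustration only: the base model (assumed here to be mistralai/Mistral-7B-v0.1, since the constituents are Mistral-based) and the density/weight values are assumptions, not the author's settings.

```yaml
# Hypothetical mergekit config sketch for a TIES merge of the three
# constituent models; density/weight values and base_model are assumed.
models:
  - model: mlabonne/NeuralMonarch-7B
    parameters:
      density: 0.5   # fraction of weight deltas kept after trimming (assumed)
      weight: 0.33   # contribution to the merged model (assumed)
  - model: mlabonne/AlphaMonarch-7B
    parameters:
      density: 0.5
      weight: 0.33
  - model: bardsai/jaskier-7b-dpo-v5.6
    parameters:
      density: 0.5
      weight: 0.33
merge_method: ties
base_model: mistralai/Mistral-7B-v0.1  # assumed common base for the deltas
dtype: bfloat16
```

With mergekit installed, a config like this would be run with `mergekit-yaml config.yml ./output-model`; the actual values used for this model may differ.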
