CorticalStack/crown-clown-7b-slerp
Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Context Length: 4k · Published: Feb 19, 2024 · License: apache-2.0 · Architecture: Transformer · Open Weights
CorticalStack/crown-clown-7b-slerp is a 7 billion parameter language model created by CorticalStack, formed by a Spherical Linear Interpolation (SLERP) merge of mlabonne/AlphaMonarch-7B and bardsai/jaskier-7b-dpo-v5.6. By blending the weights of its two base models, the merge aims to combine their strengths into a single checkpoint with a balanced performance profile for general language tasks within a 4096-token context window.
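For intuition, a SLERP merge interpolates between two parameter tensors along the arc of a hypersphere rather than along a straight line, which better preserves the geometry of the weights. A minimal sketch of the underlying formula (the function name and toy vectors are illustrative, not from the model card; production merges are typically done with a tool such as mergekit, applied tensor-by-tensor):

```python
import numpy as np

def slerp(t, v0, v1, eps=1e-8):
    """Spherical linear interpolation between two weight vectors.

    Falls back to plain linear interpolation when the vectors are
    near-parallel, where the spherical formula is ill-conditioned.
    """
    v0 = np.asarray(v0, dtype=np.float64)
    v1 = np.asarray(v1, dtype=np.float64)
    # Cosine of the angle between the normalized vectors.
    dot = np.dot(v0 / np.linalg.norm(v0), v1 / np.linalg.norm(v1))
    dot = np.clip(dot, -1.0, 1.0)
    omega = np.arccos(dot)
    if np.abs(np.sin(omega)) < eps:
        return (1.0 - t) * v0 + t * v1  # lerp fallback
    return (np.sin((1.0 - t) * omega) * v0 + np.sin(t * omega) * v1) / np.sin(omega)

# Interpolating halfway between two toy "parameter" vectors.
a = np.array([1.0, 0.0])
b = np.array([0.0, 1.0])
mid = slerp(0.5, a, b)  # lies on the unit arc between a and b
```

At t=0 the result is the first model's weights and at t=1 the second's; intermediate t values trade off between the two parents.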