CorticalStack/shadow-clown-7B-slerp
TEXT GENERATION · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 4k · Published: Mar 13, 2024 · License: apache-2.0 · Architecture: Transformer · Open Weights
CorticalStack/shadow-clown-7B-slerp is a 7 billion parameter language model created by CorticalStack by merging Gille/StrangeMerges_32-7B-slerp and yam-peleg/Experiment26-7B with the SLERP (spherical linear interpolation) merge method. SLERP interpolates between the two parent models' weights along an arc on a hypersphere rather than a straight line, which preserves each model's weight geometry while blending their capabilities. The merge is designed to absorb abilities from homologous models, making the result suitable for tasks that benefit from a blend of its parents' strengths.
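As a rough illustration of what a SLERP merge does to a pair of weight tensors, here is a minimal NumPy sketch of spherical linear interpolation. This is not the actual merge pipeline used for this model (tools like mergekit apply it per-layer with per-tensor interpolation factors); the function name, the toy vectors, and the fixed `t=0.5` blend are illustrative assumptions.

```python
import numpy as np

def slerp(t, v0, v1, eps=1e-8):
    """Spherical linear interpolation between two flattened weight tensors.

    t=0 returns v0, t=1 returns v1; intermediate t follows the arc
    between the two vectors instead of the straight chord.
    """
    v0_n = v0 / (np.linalg.norm(v0) + eps)
    v1_n = v1 / (np.linalg.norm(v1) + eps)
    dot = np.clip(np.dot(v0_n, v1_n), -1.0, 1.0)
    # Nearly parallel vectors: fall back to plain linear interpolation
    if abs(dot) > 0.9995:
        return (1 - t) * v0 + t * v1
    theta = np.arccos(dot)  # angle between the two weight vectors
    s = np.sin(theta)
    return (np.sin((1 - t) * theta) / s) * v0 + (np.sin(t * theta) / s) * v1

# Toy example: blend two orthogonal unit "weight" vectors halfway
a = np.array([1.0, 0.0])
b = np.array([0.0, 1.0])
mid = slerp(0.5, a, b)  # stays on the unit sphere, unlike (a + b) / 2
```

Note that the halfway linear average `(a + b) / 2` has norm ~0.71, whereas the SLERP midpoint keeps norm 1.0; preserving weight magnitudes in this way is the usual motivation for choosing SLERP over simple weight averaging.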