automerger/ShadowYamshadow-7B
Task: Text generation
- Concurrency cost: 1
- Model size: 7B
- Quantization: FP8
- Context length: 4k
- Published: Mar 19, 2024
- License: apache-2.0
- Architecture: Transformer
- Weights: Open

ShadowYamshadow-7B is a 7-billion-parameter language model created by Maxime Labonne through an automated merge of CorticalStack/shadow-clown-7B-slerp and automerger/YamShadow-7B. The merge uses spherical linear interpolation (SLERP) to combine the weights of its constituent models, aiming for balanced performance across general language tasks. It targets applications that need a compact yet capable model with a 4096-token context length.
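For intuition, here is a minimal sketch of how a SLERP weight merge works, assuming two checkpoints with identical architectures. The `t=0.5` midpoint, function names, and per-tensor flattening scheme are illustrative assumptions, not the actual automerger configuration:

```python
import torch

def slerp(t: float, v0: torch.Tensor, v1: torch.Tensor, eps: float = 1e-8) -> torch.Tensor:
    """Spherical linear interpolation between two weight tensors."""
    a = v0.flatten().float()
    b = v1.flatten().float()
    a_norm = a / (a.norm() + eps)
    b_norm = b / (b.norm() + eps)
    # Angle between the two flattened weight vectors.
    omega = torch.arccos(torch.clamp(a_norm @ b_norm, -1.0, 1.0))
    so = torch.sin(omega)
    if so.abs() < eps:
        # Nearly colinear tensors: fall back to plain linear interpolation.
        merged = (1.0 - t) * a + t * b
    else:
        merged = (torch.sin((1.0 - t) * omega) / so) * a + (torch.sin(t * omega) / so) * b
    return merged.reshape(v0.shape).to(v0.dtype)

def merge_state_dicts(sd0: dict, sd1: dict, t: float = 0.5) -> dict:
    """SLERP-merge two state dicts tensor by tensor (assumes matching keys/shapes)."""
    return {name: slerp(t, sd0[name], sd1[name]) for name in sd0}
```

Unlike plain averaging, SLERP interpolates along the arc between the two weight vectors, preserving their norms more faithfully, which is why it is a common choice for merging same-architecture models.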
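The checkpoint can be loaded like any other causal language model from the Hugging Face Hub; the prompt and generation settings below are illustrative defaults, not recommendations from the model card:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "automerger/ShadowYamshadow-7B"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

prompt = "Explain spherical linear interpolation in one sentence."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```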
