automerger/YamShadow-7B
Text generation · Concurrency cost: 1 · Model size: 7B · Quantization: FP8 · Context length: 4k · Published: Mar 10, 2024 · License: apache-2.0 · Architecture: Transformer · Open weights

YamShadow-7B is a 7-billion-parameter language model created through an automated merge by Maxime Labonne. It was produced with the DARE TIES merge method, combining mayacinka/yam-jom-7B and CorticalStack/shadow-clown-7B-slerp with model-specific density and weight parameters. It is designed for general text generation, leveraging the combined strengths of its constituent models.
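A DARE TIES merge of this kind is typically defined with a mergekit configuration. The sketch below is illustrative only: the base model choice (mistralai/Mistral-7B-v0.1) and the density/weight values are assumptions, not the actual parameters used for YamShadow-7B.

```yaml
# Hypothetical mergekit config sketch for a DARE TIES merge of the two
# constituent models; density/weight values and base model are placeholders.
models:
  - model: mistralai/Mistral-7B-v0.1
    # base model contributes no delta of its own
  - model: mayacinka/yam-jom-7B
    parameters:
      density: 0.5   # fraction of delta parameters kept (assumed value)
      weight: 0.5    # merge weight for this model's deltas (assumed value)
  - model: CorticalStack/shadow-clown-7B-slerp
    parameters:
      density: 0.5
      weight: 0.5
merge_method: dare_ties
base_model: mistralai/Mistral-7B-v0.1
dtype: bfloat16
```

In DARE TIES, each fine-tuned model's parameter deltas from the base are randomly sparsified (controlled by `density`), rescaled, sign-consolidated, and then combined according to `weight`, which is how two specialized models can be folded into one without retraining.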
