automerger/Strangemerges_32Yamshadow-7B
Text generation · Concurrency cost: 1 · Model size: 7B · Quant: FP8 · Context length: 4k · Published: Mar 20, 2024 · License: apache-2.0 · Architecture: Transformer · Open weights · Cold

Strangemerges_32Yamshadow-7B is a 7-billion-parameter language model created by Maxime Labonne, produced by an automated slerp (spherical linear interpolation) merge of Gille/StrangeMerges_32-7B-slerp and automerger/YamShadow-7B. The merge configuration blends the weights of the two base models, and the result is intended for general text generation, with performance characteristics inherited from both parents.
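The model card does not publish the exact merge configuration, but the core operation behind a slerp merge can be sketched in a few lines: each pair of corresponding weight tensors from the two parent models is interpolated along the arc between them rather than along a straight line. The function below is an illustrative sketch, not the actual merge code used to build this model.

```python
import numpy as np

def slerp(t, v0, v1, eps=1e-8):
    """Spherical linear interpolation between two weight tensors.

    Illustrative sketch of the interpolation used by automated model
    mergers; t=0 returns v0, t=1 returns v1, intermediate t follows
    the arc on the unit sphere between the (normalized) tensors.
    """
    v0_flat, v1_flat = v0.ravel(), v1.ravel()
    n0 = v0_flat / np.linalg.norm(v0_flat)
    n1 = v1_flat / np.linalg.norm(v1_flat)
    dot = np.clip(np.dot(n0, n1), -1.0, 1.0)
    if abs(dot) > 1 - eps:
        # Nearly parallel tensors: fall back to plain linear interpolation
        return (1 - t) * v0 + t * v1
    theta = np.arccos(dot)          # angle between the two tensors
    s = np.sin(theta)
    out = (np.sin((1 - t) * theta) / s) * v0_flat + (np.sin(t * theta) / s) * v1_flat
    return out.reshape(v0.shape)
```

In an actual merge, this interpolation is applied layer by layer across both checkpoints, often with a different `t` per layer group, which is what the model card's "specific merging configuration" refers to.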
