automerger/Inex12Yamshadow-7B
Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 4k · Published: Mar 15, 2024 · License: apache-2.0 · Architecture: Transformer · Open Weights · Cold
automerger/Inex12Yamshadow-7B is a 7-billion-parameter language model created by Maxime Labonne through an automated merge process. It combines MSL7/INEX12-7b and automerger/YamShadow-7B using a slerp (spherical linear interpolation) merge, with interpolation parameters tuned separately for the self_attn and mlp layers. The model targets general language generation tasks, drawing on the combined strengths of its constituent models within a 4096-token context length.
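The core operation behind a slerp merge is spherical linear interpolation applied per weight tensor. The page does not show the merge configuration, so the following is a minimal illustrative sketch of the slerp formula itself (function name and fallback behavior are assumptions, not the actual merge tooling used):

```python
import numpy as np

def slerp(t, v0, v1, eps=1e-8):
    """Spherically interpolate between two weight tensors.

    t=0 returns v0, t=1 returns v1; intermediate t values follow
    the arc defined by the angle between the two tensors.
    This is an illustrative sketch, not the tool used for this model.
    """
    v0f = v0.flatten().astype(np.float64)
    v1f = v1.flatten().astype(np.float64)
    # Measure the angle between the tensors using normalized copies.
    n0 = v0f / np.linalg.norm(v0f)
    n1 = v1f / np.linalg.norm(v1f)
    dot = np.clip(np.dot(n0, n1), -1.0, 1.0)
    omega = np.arccos(dot)
    if np.abs(np.sin(omega)) < eps:
        # Nearly parallel tensors: fall back to plain linear interpolation.
        merged = (1.0 - t) * v0f + t * v1f
    else:
        merged = (np.sin((1.0 - t) * omega) * v0f
                  + np.sin(t * omega) * v1f) / np.sin(omega)
    return merged.reshape(v0.shape)
```

In a layer-wise merge like this one, a different `t` schedule can be applied to attention (self_attn) and feed-forward (mlp) tensors, which is what "adjusting parameters for self_attn and mlp layers" refers to.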