automerger/ShadowYam-7B
Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 4k · Published: Mar 8, 2024 · License: apache-2.0 · Architecture: Transformer · Open Weights · Cold
ShadowYam-7B is a 7-billion-parameter language model created by Maxime Labonne, produced by an automated merge of CorticalStack/shadow-clown-7B-slerp and mayacinka/yam-jom-7B using the DARE TIES method. The merge is configured with bfloat16 dtype and int8 masking, and the model is intended for general text generation. By combining the two parent models, the merge aims for a balanced performance profile across their respective strengths.
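DARE TIES merges like this one are typically produced with mergekit. A minimal sketch of what such a merge configuration might look like follows; the base model choice and the density/weight values are illustrative assumptions, not the actual recipe used for ShadowYam-7B:

```yaml
# Hypothetical mergekit config sketch for a DARE TIES merge of the two
# parent models named in the description. Density/weight values and the
# Mistral base model are assumptions, not the published recipe.
models:
  - model: CorticalStack/shadow-clown-7B-slerp
    parameters:
      density: 0.53   # fraction of delta weights kept after DARE pruning (assumed)
      weight: 0.5     # contribution of this model to the merge (assumed)
  - model: mayacinka/yam-jom-7B
    parameters:
      density: 0.53
      weight: 0.5
merge_method: dare_ties
base_model: mistralai/Mistral-7B-v0.1   # assumed common ancestor of both 7B parents
parameters:
  int8_mask: true     # int8 masking, as noted in the description
dtype: bfloat16       # dtype, as noted in the description
```

A config of this shape would be run with mergekit's command-line merge tool to produce the merged checkpoint.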