automerger/YamshadowExperiment28-7B
Text generation · Model size: 7B · Quantization: FP8 · Context length: 4k · Published: Mar 18, 2024 · License: apache-2.0 · Architecture: Transformer · Open weights

automerger/YamshadowExperiment28-7B is a 7 billion parameter language model created by Maxime Labonne through an automated merge of automerger/YamShadow-7B and yam-peleg/Experiment28-7B. The model uses a 4096-token context window. As of April 8, 2024, it holds the top position among 7B models on the Open LLM Leaderboard, though its creator notes this may indicate benchmark overfitting. It is intended for general language tasks and chat applications and supports the Alpaca chat template.
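
The sketch below shows one way to load the model with Hugging Face Transformers and prompt it with an Alpaca-style template. It is a minimal, illustrative example rather than the official usage snippet; the prompt text and generation settings are assumptions, not taken from the model card.

```python
# Minimal usage sketch (assumed, not from the model card): load the model
# and generate a completion from an Alpaca-style prompt.
from transformers import AutoTokenizer, AutoModelForCausalLM

model_id = "automerger/YamshadowExperiment28-7B"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

# Alpaca-style prompt template (instruction text is illustrative)
prompt = (
    "Below is an instruction that describes a task. "
    "Write a response that appropriately completes the request.\n\n"
    "### Instruction:\nExplain model merging in one paragraph.\n\n"
    "### Response:\n"
)

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(
    **inputs, max_new_tokens=256, do_sample=True, temperature=0.7
)
# Print only the newly generated tokens, skipping the prompt
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```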
