s3nh/SeverusWestLake-7B-DPO
Text generation · Model size: 7B · Quantization: FP8 · Context length: 4k · Published: Feb 4, 2024 · License: MIT · Architecture: Transformer

SeverusWestLake-7B-DPO is a 7-billion-parameter language model created by s3nh, produced by merging FelixChao/Sectumsempra-7B-DPO and cognitivecomputations/WestLake-7B-v2-laser with SLERP (spherical linear interpolation). The model shows strong general reasoning, with an average score of 75.42 on the Open LLM Leaderboard and notably high results on HellaSwag (88.94) and Winogrande (86.11). It is intended for general-purpose applications that require robust language understanding and generation.
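For readers unfamiliar with the merge method: SLERP blends each pair of corresponding weight tensors along the arc between them on a hypersphere, rather than along the straight line used by plain averaging. Below is a minimal, stdlib-only sketch of the interpolation step on flat weight vectors; it is an illustration of the technique, not the actual tooling used to produce this merge, and the interpolation factor `t` is a hypothetical example value.

```python
import math

def slerp(t, v0, v1, eps=1e-8):
    """Spherical linear interpolation between two flat weight vectors.

    t=0.0 returns v0, t=1.0 returns v1; intermediate t values follow
    the great-circle arc between the two vectors.
    """
    dot = sum(a * b for a, b in zip(v0, v1))
    n0 = math.sqrt(sum(a * a for a in v0))
    n1 = math.sqrt(sum(b * b for b in v1))
    # Clamp the cosine to avoid domain errors from floating-point drift
    cos = max(-1.0, min(1.0, dot / (n0 * n1 + eps)))
    theta = math.acos(cos)
    if abs(math.sin(theta)) < eps:
        # Nearly parallel vectors: fall back to linear interpolation
        return [(1.0 - t) * a + t * b for a, b in zip(v0, v1)]
    s0 = math.sin((1.0 - t) * theta) / math.sin(theta)
    s1 = math.sin(t * theta) / math.sin(theta)
    return [s0 * a + s1 * b for a, b in zip(v0, v1)]

# Example: midpoint between two orthogonal unit vectors stays on the
# unit sphere, which plain averaging would not preserve.
mid = slerp(0.5, [1.0, 0.0], [0.0, 1.0])
```

In practice, merge tools apply this tensor by tensor, often with a different `t` per layer, so the blend ratio can favor one parent model in some parts of the network and the other parent elsewhere.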
