shadowml/WestBeagle-7B
Text generation · Model size: 7B · Quant: FP8 · Context length: 4k · Published: Jan 29, 2024 · License: cc-by-nc-4.0 · Architecture: Transformer · Open weights

WestBeagle-7B is a 7 billion parameter language model created by shadowml by merging mlabonne/NeuralBeagle14-7B and FelixChao/WestSeverus-7B-DPO-v2 with the SLERP (spherical linear interpolation) merge method. The model demonstrates strong general reasoning capabilities, achieving an average score of 75.22 on the Open LLM Leaderboard, and is well-suited for tasks requiring robust reasoning and general language understanding across a range of benchmarks.
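As a rough intuition for what a SLERP merge does, the idea is to interpolate between the two parent models' weight tensors along the arc of a hypersphere rather than along a straight line, which better preserves each parent's weight geometry. The sketch below shows the underlying slerp formula on plain vectors; it is illustrative only and is not shadowml's actual merge configuration (merge tools such as mergekit apply this per tensor, often with a per-layer interpolation schedule).

```python
import math

def slerp(t, v0, v1, eps=1e-8):
    """Spherical linear interpolation between vectors v0 and v1.

    t=0 returns v0, t=1 returns v1; intermediate t values move
    along the great-circle arc between the two directions.
    """
    dot = sum(a * b for a, b in zip(v0, v1))
    n0 = math.sqrt(sum(a * a for a in v0))
    n1 = math.sqrt(sum(b * b for b in v1))
    # Clamp to avoid domain errors from floating-point drift
    cos_omega = max(-1.0, min(1.0, dot / (n0 * n1)))
    omega = math.acos(cos_omega)
    if abs(math.sin(omega)) < eps:
        # Nearly parallel vectors: fall back to plain linear interpolation
        return [(1 - t) * a + t * b for a, b in zip(v0, v1)]
    s0 = math.sin((1 - t) * omega) / math.sin(omega)
    s1 = math.sin(t * omega) / math.sin(omega)
    return [s0 * a + s1 * b for a, b in zip(v0, v1)]

# Midpoint between two orthogonal unit vectors stays on the unit circle
mid = slerp(0.5, [1.0, 0.0], [0.0, 1.0])
```

In a real model merge, this interpolation runs over every pair of matching parameter tensors from the two parent checkpoints, producing the merged model's weights.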
