shadowml/DareBeagle-7B
Text generation · Model size: 7B · Quant: FP8 · Context length: 4k · Published: Jan 16, 2024 · License: apache-2.0 · Architecture: Transformer · Concurrency cost: 1 · Open weights

DareBeagle-7B is a 7-billion-parameter language model created by shadowml, formed by merging mlabonne/NeuralBeagle14-7B and mlabonne/NeuralDaredevil-7B with a spherical linear interpolation (slerp) merge. The model demonstrates strong general reasoning, achieving an average score of 74.58 across the Open LLM Leaderboard benchmarks. With a context length of 4096 tokens, it is suitable for tasks requiring robust language understanding and generation.
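To illustrate the merge method mentioned above: slerp interpolates between two weight vectors along the arc of the hypersphere rather than along a straight line, which tends to preserve the norm of the blended weights. The sketch below is a minimal, self-contained implementation of slerp on plain Python lists for illustration only; it is not the actual merge code (merges like this are typically done with a dedicated tool such as mergekit over full tensors), and the function name and signature are assumptions.

```python
import math

def slerp(t, v0, v1, eps=1e-8):
    """Spherical linear interpolation between vectors v0 and v1 at fraction t.

    Illustrative sketch only -- not the actual code used to merge DareBeagle-7B.
    """
    # Compute the angle omega between the two vectors from their normalized dot product.
    n0 = math.sqrt(sum(x * x for x in v0))
    n1 = math.sqrt(sum(x * x for x in v1))
    dot = sum(a * b for a, b in zip(v0, v1)) / (n0 * n1)
    dot = max(-1.0, min(1.0, dot))  # guard against floating-point drift
    omega = math.acos(dot)
    if abs(math.sin(omega)) < eps:
        # Nearly parallel vectors: fall back to plain linear interpolation.
        return [(1 - t) * a + t * b for a, b in zip(v0, v1)]
    # Standard slerp weights: sin((1-t)*omega)/sin(omega) and sin(t*omega)/sin(omega).
    s0 = math.sin((1 - t) * omega) / math.sin(omega)
    s1 = math.sin(t * omega) / math.sin(omega)
    return [s0 * a + s1 * b for a, b in zip(v0, v1)]
```

At t=0 the result is the first vector, at t=1 the second; for unit vectors every intermediate point also has unit norm, which is the property that distinguishes slerp from naive weight averaging.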
