eren23/NeuralDareBeagle-7B-slerp
Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 4k · Published: Jan 28, 2024 · License: apache-2.0 · Architecture: Transformer · Open Weights
NeuralDareBeagle-7B-slerp is a 7 billion parameter language model created by eren23, produced by merging mlabonne/NeuralBeagle14-7B and mlabonne/DareBeagle-7B-v2 via spherical linear interpolation (SLERP). The merged model demonstrates strong general reasoning, achieving an average score of 74.60 on the Open LLM Leaderboard across its benchmark suite. It is suitable for tasks requiring robust language understanding and generation, particularly common-sense reasoning and multiple-choice question answering.
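A SLERP merge interpolates each pair of corresponding weight tensors along the great-circle arc between them rather than averaging them linearly, which better preserves the geometry of the two parameter sets. The following is a minimal sketch of that operation on a single tensor pair, not the exact implementation used to produce this model; the function name, tolerance, and fallback behavior are illustrative assumptions.

```python
import torch

def slerp(t: float, v0: torch.Tensor, v1: torch.Tensor, eps: float = 1e-8) -> torch.Tensor:
    """Spherical linear interpolation between two weight tensors at ratio t in [0, 1]."""
    # Normalize copies to measure the angle between the two tensors.
    v0_n = v0 / (torch.norm(v0) + eps)
    v1_n = v1 / (torch.norm(v1) + eps)
    dot = torch.clamp(torch.sum(v0_n * v1_n), -1.0, 1.0)

    # Nearly parallel tensors: fall back to plain linear interpolation
    # to avoid dividing by a vanishing sin(theta).
    if torch.abs(dot) > 0.9995:
        return (1.0 - t) * v0 + t * v1

    theta = torch.arccos(dot)                      # angle between the tensors
    sin_theta = torch.sin(theta)
    s0 = torch.sin((1.0 - t) * theta) / sin_theta  # weight on v0
    s1 = torch.sin(t * theta) / sin_theta          # weight on v1
    return s0 * v0 + s1 * v1
```

Applying this per layer (or per tensor) with a chosen interpolation schedule yields the merged checkpoint.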
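For text generation, the checkpoint can be loaded with the standard Hugging Face Transformers API, as sketched below. This assumes a GPU with enough memory for the 7B weights in fp16; the FP8 quantization noted above refers to the hosted deployment, not necessarily the raw open-weights checkpoint.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "eren23/NeuralDareBeagle-7B-slerp"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # fp16 weights; assumes sufficient GPU memory
    device_map="auto",
)

prompt = "Explain why the sky is blue in one sentence."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=100)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```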