shadowml/BeagSake-7B is a 7-billion-parameter language model created by shadowml, produced by merging BeagleSempra-7B and WestBeagle-7B with the SLERP (spherical linear interpolation) method. The model demonstrates strong general language understanding and reasoning, achieving an average score of 75.38 on the Open LLM Leaderboard, and is suitable for a wide range of general-purpose natural language processing tasks, including question answering and text generation.
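To give a rough sense of what a SLERP merge does, the sketch below interpolates a single pair of weight tensors along the great circle between them. This is a minimal illustration, not the exact recipe used for BeagSake-7B: real merges are typically driven by a tool such as mergekit with per-layer interpolation factors, and the function name `slerp` and the fixed factor `t=0.5` here are assumptions for the example.

```python
import torch

def slerp(t: float, a: torch.Tensor, b: torch.Tensor, eps: float = 1e-8) -> torch.Tensor:
    """Spherical linear interpolation between two weight tensors.

    Treats each tensor as a flat vector, interpolates along the great
    circle between the two, and falls back to linear interpolation
    when the vectors are nearly parallel.
    """
    a_flat, b_flat = a.flatten().float(), b.flatten().float()
    a_unit = a_flat / (a_flat.norm() + eps)
    b_unit = b_flat / (b_flat.norm() + eps)
    dot = torch.clamp(a_unit @ b_unit, -1.0, 1.0)
    theta = torch.acos(dot)  # angle between the two weight vectors
    if theta.abs() < 1e-4:
        # Nearly parallel vectors: plain lerp is numerically safer
        merged = (1 - t) * a_flat + t * b_flat
    else:
        sin_theta = torch.sin(theta)
        merged = (torch.sin((1 - t) * theta) / sin_theta) * a_flat \
               + (torch.sin(t * theta) / sin_theta) * b_flat
    return merged.reshape(a.shape).to(a.dtype)

# Conceptually applied per parameter across the two source checkpoints:
#   merged[name] = slerp(0.5, beaglesempra_state[name], westbeagle_state[name])
```

Compared with simple linear averaging, SLERP preserves the norm and angular geometry of the interpolated weights, which is why it is a popular choice for merging checkpoints that share an architecture.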