fhai50032/BeagleLake-7B
Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 4K · Published: Jan 30, 2024 · License: apache-2.0 · Architecture: Transformer · Open Weights

BeagleLake-7B is a 7-billion-parameter language model created by fhai50032, produced by merging mlabonne/NeuralBeagle14-7B and fhai50032/RolePlayLake-7B with the DARE TIES method. The merge aims to combine the general capabilities of NeuralBeagle14-7B with the role-playing strengths and uncensored nature of RolePlayLake-7B. It is intended as a versatile base for further fine-tuning and achieves an average score of 72.34 on the Open LLM Leaderboard.
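A minimal usage sketch with the standard Hugging Face transformers API is shown below; the prompt and generation settings are illustrative assumptions, not values recommended by the model author.

```python
# Minimal sketch: load BeagleLake-7B and generate text with transformers.
# dtype, device placement, and sampling parameters are example choices only.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "fhai50032/BeagleLake-7B"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # adjust precision for your hardware
    device_map="auto",
)

prompt = "Write a short scene between a detective beagle and a nervous witness."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=200, do_sample=True, temperature=0.7)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```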
