mlabonne/Darewin-7B
Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 4K · Published: Jan 23, 2024 · License: apache-2.0 · Architecture: Transformer · Open Weights

Darewin-7B is a 7-billion-parameter language model created by mlabonne by merging six distinct Mistral-7B-based models with LazyMergekit using the dare_ties merge method. The merge aims to combine the strengths of its constituent models, which include Intel/neural-chat-7b-v3-3 and openchat/openchat-3.5-0106, for balanced performance across reasoning and language-understanding tasks. It achieves an average score of 71.87 on the Open LLM Leaderboard, making it suitable for general-purpose applications that require robust language capabilities.
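As a minimal sketch of how the model might be used for text generation, assuming the standard Hugging Face transformers API and that the checkpoint follows the usual causal-LM layout (the prompt below is purely illustrative):

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "mlabonne/Darewin-7B"

# Load the tokenizer and model weights from the Hugging Face Hub.
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    device_map="auto",   # place layers on available GPU(s) or CPU
    torch_dtype="auto",  # use the checkpoint's native precision
)

# Illustrative prompt; any general-purpose instruction works similarly.
prompt = "Explain the difference between supervised and unsupervised learning."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

# Generate a completion and decode it back to text.
outputs = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```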
