mlabonne/OmniBeagle-7B
Text Generation · Model Size: 7B · Quant: FP8 · Context Length: 4k · Published: Jan 31, 2024 · License: cc-by-nc-4.0 · Architecture: Transformer · Concurrency Cost: 1 · Open Weights

OmniBeagle-7B is a 7 billion parameter language model created by mlabonne, formed by merging three BeagleSempra-7B variants using the DARE TIES method. Built upon the Mistral-7B-v0.1 architecture, this model achieves an average score of 75.66 on the Open LLM Leaderboard, demonstrating strong performance across various reasoning and language understanding tasks. It is designed for general-purpose applications requiring robust language generation and comprehension within a 4096-token context window.
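The DARE TIES merge mentioned above combines two ideas: DARE randomly drops a fraction of each source model's delta weights (its difference from the base model) and rescales the survivors, and TIES resolves sign conflicts between the remaining deltas before averaging them onto the base. The following toy NumPy sketch illustrates the idea on plain arrays; the function name `dare_ties_merge` and all parameters are illustrative assumptions, not the actual mergekit implementation used to build this model.

```python
import numpy as np

def dare_ties_merge(base, deltas, drop_p=0.5, seed=0):
    """Toy sketch of a DARE TIES merge on raw weight arrays.

    base   : base-model weight tensor (e.g. from Mistral-7B-v0.1)
    deltas : list of (finetuned - base) weight tensors, one per source model
    drop_p : DARE drop probability for each delta element
    """
    rng = np.random.default_rng(seed)

    # DARE step: randomly drop delta elements, rescale survivors by 1/(1-p)
    # so the expected magnitude of each delta is preserved.
    pruned = []
    for d in deltas:
        keep = rng.random(d.shape) >= drop_p
        pruned.append(np.where(keep, d / (1.0 - drop_p), 0.0))
    stacked = np.stack(pruned)

    # TIES sign election: per parameter, pick the sign with the larger
    # total magnitude across source models.
    elected = np.sign(stacked.sum(axis=0))

    # Keep only surviving deltas that agree with the elected sign,
    # then average them (disjoint merge) onto the base weights.
    agree = (np.sign(stacked) == elected) & (stacked != 0)
    summed = np.where(agree, stacked, 0.0).sum(axis=0)
    counts = agree.sum(axis=0)
    merged_delta = np.where(counts > 0, summed / np.maximum(counts, 1), 0.0)
    return base + merged_delta
```

With `drop_p=0` and three identical deltas, the merge reduces to simply adding that delta to the base, which is a quick sanity check on the sign-election and averaging logic.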
