Kukedlc/NeuTrixOmniBe-7B-model-remix
Text generation · Concurrency cost: 1 · Model size: 7B · Quant: FP8 · Context length: 4k · Published: Feb 10, 2024 · License: apache-2.0 · Architecture: Transformer · Open weights
Kukedlc/NeuTrixOmniBe-7B-model-remix is a 7-billion-parameter merged language model created by Kukedlc, combining CultriX/NeuralTrix-7B-dpo and paulml/OmniBeagleSquaredMBX-v3-7B-v2. The merge uses SLERP (spherical linear interpolation), and the resulting model achieves an average score of 76.30 on the Open LLM Leaderboard. With a 4096-token context length, it is designed for general-purpose text generation and understanding, with strong results on reasoning, common-sense, and question-answering benchmarks.
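At its core, a SLERP merge interpolates corresponding weight tensors of the two parent models along the great-circle arc between their directions, rather than averaging them linearly. A minimal sketch of that operation on toy vectors (the actual merge applies this per tensor, typically via tooling such as mergekit; the function below is illustrative, not the model's exact recipe):

```python
import numpy as np

def slerp(t, a, b, eps=1e-8):
    """Spherical linear interpolation between two weight vectors.

    t=0 returns a, t=1 returns b; intermediate values of t follow
    the great-circle arc between the directions of a and b.
    """
    a_n = a / (np.linalg.norm(a) + eps)
    b_n = b / (np.linalg.norm(b) + eps)
    dot = np.clip(np.dot(a_n, b_n), -1.0, 1.0)
    theta = np.arccos(dot)  # angle between the two directions
    if theta < eps:
        # Nearly parallel vectors: fall back to linear interpolation
        return (1 - t) * a + t * b
    sin_theta = np.sin(theta)
    return (np.sin((1 - t) * theta) / sin_theta) * a + \
           (np.sin(t * theta) / sin_theta) * b

# Halfway between two orthogonal toy "weight" vectors
w = slerp(0.5, np.array([1.0, 0.0]), np.array([0.0, 1.0]))
```

Compared with a plain weighted average, SLERP preserves the geometric character of each parent's weights, which is one reason it is a popular choice for model merging.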