eren23/merged-dpo-binarized-NeutrixOmnibe-7B
Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 4k · Published: Feb 12, 2024 · License: apache-2.0 · Architecture: Transformer · Open Weights · Cold
eren23/merged-dpo-binarized-NeutrixOmnibe-7B is a 7-billion-parameter language model by eren23, created by merging eren23/dpo-binarized-NeutrixOmnibe-7B and Kukedlc/NeuTrixOmniBe-7B-model-remix with LazyMergekit. The model demonstrates strong general reasoning, achieving an average score of 76.20 across the Open LLM Leaderboard benchmarks, and supports a 4096-token context window, making it suitable for tasks requiring robust language understanding and generation.
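As a minimal sketch, the model can be loaded for text generation with the Hugging Face transformers library; the prompt and generation settings below are illustrative assumptions, not part of the model card.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "eren23/merged-dpo-binarized-NeutrixOmnibe-7B"

# Load tokenizer and weights from the Hugging Face Hub;
# device_map="auto" places layers on available GPUs (requires accelerate).
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

prompt = "Explain the difference between fusion and fission in one paragraph."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

# Keep prompt plus completion within the model's 4096-token context window.
outputs = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```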