jsfs11/MoEv4Config-TestWeightedTIES-7b
Text generation · Model size: 7B · Quantization: FP8 · Context length: 4k · Published: Feb 12, 2024 · License: apache-2.0 · Architecture: Transformer
jsfs11/MoEv4Config-TestWeightedTIES-7b is a 7-billion-parameter language model created by jsfs11, formed by merging Kukedlc/NeuTrixOmniBe-7B-model-remix, PetroGPT/WestSeverus-7B-DPO, and vanillaOVO/supermario_v4 using the TIES merging method. The merge assigns each constituent model its own density and weight parameters, with int8 masking and weight normalization enabled. The model achieves an average score of 75.39 on the Open LLM Leaderboard, indicating balanced performance across reasoning and language-understanding tasks.
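A TIES merge like this is typically defined with a mergekit YAML configuration. The sketch below shows the general structure only: the actual density and weight values, base model, and dtype used for this merge are not stated here and are placeholders.

```yaml
# Hypothetical mergekit config sketch for a weighted TIES merge.
# Density/weight values, base_model, and dtype are illustrative assumptions,
# not the actual parameters used for MoEv4Config-TestWeightedTIES-7b.
models:
  - model: Kukedlc/NeuTrixOmniBe-7B-model-remix
    parameters:
      density: 0.5   # fraction of delta weights kept per model (assumed value)
      weight: 0.4    # relative contribution in the merge (assumed value)
  - model: PetroGPT/WestSeverus-7B-DPO
    parameters:
      density: 0.5
      weight: 0.3
  - model: vanillaOVO/supermario_v4
    parameters:
      density: 0.5
      weight: 0.3
merge_method: ties
base_model: mistralai/Mistral-7B-v0.1  # placeholder; TIES requires a common base
parameters:
  normalize: true    # normalize merged weights
  int8_mask: true    # the "int8 masking" mentioned above
dtype: bfloat16
```

In TIES merging, each model's delta from the base is sparsified to the given `density`, sign conflicts across models are resolved by majority, and the surviving deltas are combined according to their `weight` values.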