macadeliccc/MonarchLake-7B
Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Context Length: 4k · Published: Feb 22, 2024 · License: cc-by-nc-4.0 · Architecture: Transformer · Open Weights

macadeliccc/MonarchLake-7B is a 7 billion parameter language model, merged from mlabonne/AlphaMonarch-7B and macadeliccc/WestLake-7b-v2-laser-truthy-dpo using the SLERP method. The merge builds on the AlphaMonarch-7B base and is intended to enhance its emotional intelligence. It achieves an average score of 76.10 on the Open LLM Leaderboard, indicating strong performance across reasoning and language understanding benchmarks.
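SLERP (spherical linear interpolation) merges two models by interpolating each pair of parameter tensors along the arc between them rather than along a straight line, which tends to preserve the magnitude of the weights better than plain averaging. A minimal per-tensor sketch in NumPy (illustrative only; the actual merge would typically be produced with a dedicated merging tool, and the function name `slerp` here is our own):

```python
import numpy as np

def slerp(t, v0, v1, eps=1e-8):
    """Spherical linear interpolation between two weight tensors.

    t=0 returns v0, t=1 returns v1; intermediate t values follow the
    great-circle arc between the flattened tensors.
    """
    v0_f = v0.ravel().astype(np.float64)
    v1_f = v1.ravel().astype(np.float64)
    # Cosine of the angle between the two flattened tensors.
    dot = np.dot(v0_f, v1_f) / (np.linalg.norm(v0_f) * np.linalg.norm(v1_f) + eps)
    dot = np.clip(dot, -1.0, 1.0)
    omega = np.arccos(dot)
    # Nearly colinear tensors: fall back to ordinary linear interpolation.
    if abs(np.sin(omega)) < eps:
        return ((1.0 - t) * v0_f + t * v1_f).reshape(v0.shape).astype(v0.dtype)
    s0 = np.sin((1.0 - t) * omega) / np.sin(omega)
    s1 = np.sin(t * omega) / np.sin(omega)
    return (s0 * v0_f + s1 * v1_f).reshape(v0.shape).astype(v0.dtype)
```

In a real merge this interpolation would be applied layer by layer to the corresponding tensors of the two source checkpoints, often with a per-layer interpolation schedule rather than a single global `t`.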
