eren23/ogno-monarch-jaskier-merge-7b-v2
Task: Text Generation
Concurrency Cost: 1
Model Size: 7B
Quant: FP8
Ctx Length: 4k
Published: Feb 20, 2024
License: cc-by-nc-4.0
Architecture: Transformer
Open Weights
eren23/ogno-monarch-jaskier-merge-7b-v2 is a 7-billion-parameter language model by eren23, produced by merging several models, including eren23/ogno-monarch-jaskier-merge-7b and mlabonne/AlphaMonarch-7B, with the DARE TIES merging method. It demonstrates strong general reasoning across benchmarks, scoring an average of 76.35 on the Open LLM Leaderboard. With a context length of 4096 tokens, it suits a range of general-purpose language generation and understanding tasks.
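As a rough usage sketch, the model can be loaded through the Hugging Face `transformers` library, assuming the weights are published on the Hub under the id above and that `transformers` and `torch` are installed with enough GPU or CPU memory for a 7B model. The `generate` helper below is illustrative, not part of the model card:

```python
MODEL_ID = "eren23/ogno-monarch-jaskier-merge-7b-v2"
CTX_LEN = 4096  # context window stated on the model card


def generate(prompt: str, max_new_tokens: int = 128) -> str:
    """Complete a prompt with the merged model.

    Imports are kept inside the function so the constants above can be
    reused without transformers installed.
    """
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID, device_map="auto", torch_dtype="auto"
    )
    # Truncate the prompt to the 4k context window advertised above.
    inputs = tokenizer(
        prompt, return_tensors="pt", truncation=True, max_length=CTX_LEN
    ).to(model.device)
    out = model.generate(**inputs, max_new_tokens=max_new_tokens)
    # Return only the newly generated tokens, not the echoed prompt.
    return tokenizer.decode(
        out[0][inputs["input_ids"].shape[-1]:], skip_special_tokens=True
    )
```

Note that the non-commercial cc-by-nc-4.0 license applies to any use of these weights.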