mayacinka/yam-jom-7B-slerp
Text Generation | Concurrency Cost: 1 | Model Size: 7B | Quant: FP8 | Ctx Length: 4k | Published: Mar 3, 2024 | License: apache-2.0 | Architecture: Transformer | Open Weights | Cold

mayacinka/yam-jom-7B-slerp is a 7 billion parameter language model created by mayacinka, formed by merging eren23/ogno-monarch-jaskier-merge-7b-OH-PREF-DPO-v2 and yam-peleg/Experiment26-7B using the SLERP (spherical linear interpolation) merge method. The model demonstrates strong general reasoning, achieving an average score of 76.45 across the Open LLM Leaderboard benchmarks, and is suitable for a wide range of natural language processing tasks that require robust understanding and generation.
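Since the weights are openly published, the model can be loaded like any other causal LM on the Hugging Face Hub. The sketch below is a minimal inference example using the transformers library; the prompt and generation parameters are illustrative assumptions, not settings documented by the model author.

```python
# Minimal sketch: text generation with mayacinka/yam-jom-7B-slerp via transformers.
# Prompt and max_new_tokens are arbitrary illustration choices.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "mayacinka/yam-jom-7B-slerp"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

prompt = "Explain spherical linear interpolation (SLERP) in one sentence."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```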
