mayacinka/yam-jom-7B-ties
Text generation · Concurrency cost: 1 · Model size: 7B · Quantization: FP8 · Context length: 4k · Published: Mar 3, 2024 · License: apache-2.0 · Architecture: Transformer · Open weights · Cold

mayacinka/yam-jom-7B-ties is a 7 billion parameter language model created by mayacinka, formed by merging eren23/ogno-monarch-jaskier-merge-7b-OH-PREF-DPO-v2 and yam-peleg/Experiment26-7B using the TIES merging method. This model achieves an average score of 76.44 on the Open LLM Leaderboard, demonstrating strong performance across various reasoning and language understanding benchmarks. With a 4096-token context length, it is suitable for general-purpose text generation and instruction-following tasks.
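The TIES merging method mentioned above works in three steps: trim each fine-tuned model's weight deltas to the largest-magnitude entries, elect a sign per parameter by majority mass, then average only the deltas that agree with the elected sign. The sketch below illustrates the idea on small NumPy vectors; the function name, the `density` parameter, and the toy weights are illustrative assumptions, not the actual merge configuration used for this model.

```python
import numpy as np

def ties_merge(base, finetuned, density=0.5):
    # Sketch of TIES merging (Trim, Elect Sign, Merge) on flat weight
    # vectors. `base` is the shared base model's weights; `finetuned`
    # is a list of fine-tuned weight vectors. Illustrative only.
    deltas = [ft - base for ft in finetuned]

    # 1. Trim: keep only the top-`density` fraction of each delta by magnitude.
    trimmed = []
    for d in deltas:
        k = int(len(d) * density)
        thresh = np.sort(np.abs(d))[-k] if k > 0 else np.inf
        trimmed.append(np.where(np.abs(d) >= thresh, d, 0.0))

    # 2. Elect sign: per parameter, keep the sign with the larger total mass.
    elected = np.sign(np.sum(trimmed, axis=0))

    # 3. Disjoint merge: average only the deltas agreeing with the elected sign.
    agree = [np.where(np.sign(t) == elected, t, 0.0) for t in trimmed]
    counts = np.sum([a != 0 for a in agree], axis=0)
    merged_delta = np.sum(agree, axis=0) / np.maximum(counts, 1)
    return base + merged_delta

base = np.zeros(4)
ft_a = np.array([0.9, -0.1, 0.5, 0.0])
ft_b = np.array([0.8,  0.7, -0.4, 0.1])
print(ties_merge(base, [ft_a, ft_b]))
```

Because conflicting-sign deltas are dropped before averaging, TIES tends to preserve each parent's strongest updates rather than washing them out, which is why it is a popular choice for merges like this one.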
