mayacinka/yam-jom-7B
TEXT GENERATION
Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 4k
Published: Mar 2, 2024 · License: apache-2.0 · Architecture: Transformer · Open Weights · Cold
mayacinka/yam-jom-7B is a 7-billion-parameter language model created by mayacinka. Rather than being trained from scratch, it was produced through a task arithmetic merge of eren23/ogno-monarch-jaskier-merge-7b-OH-PREF-DPO-v2 and yam-peleg/Experiment26-7B. The model is intended for general language tasks and performs well across benchmarks covering reasoning, common sense, and question answering, achieving an average score of 76.60 on the Open LLM Leaderboard. This makes it suitable for applications that require robust language understanding and generation.
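To illustrate the idea behind a task arithmetic merge, here is a minimal sketch: each source model's "task vector" (its parameter delta from a shared base) is scaled and summed back onto the base. This is a toy illustration on a hand-made parameter dictionary, not the actual recipe used for yam-jom-7B; real merges operate on full checkpoints (commonly via tools like mergekit), and the parameter names, values, and 0.5/0.5 weights below are assumptions for demonstration.

```python
# Toy sketch of task-arithmetic merging (illustrative values, not the real recipe).
def task_arithmetic_merge(base, models, weights):
    """Merge models by adding weighted task vectors (model - base) onto the base."""
    merged = {}
    for name, base_val in base.items():
        # Task vector for each model is its delta from the shared base.
        delta = sum(w * (m[name] - base_val) for m, w in zip(models, weights))
        merged[name] = base_val + delta
    return merged

# Two hypothetical fine-tunes of one base model, merged with equal weights.
base = {"layer.w": 1.0, "layer.b": 0.0}
model_a = {"layer.w": 1.5, "layer.b": 0.25}  # stands in for the first source model
model_b = {"layer.w": 0.5, "layer.b": 0.5}   # stands in for the second source model
merged = task_arithmetic_merge(base, [model_a, model_b], [0.5, 0.5])
print(merged)  # {'layer.w': 1.0, 'layer.b': 0.375}
```

With equal weights, opposing deltas cancel (as on `layer.w`) while agreeing deltas average, which is what lets a merge combine strengths of its source models.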