core-3/kuno-dogpark-7b
Text generation
Concurrency cost: 1
Model size: 7B
Quant: FP8
Context length: 4k
Published: Mar 3, 2024
License: cc-by-nc-2.0
Architecture: Transformer
Open weights

core-3/kuno-dogpark-7b is a 7-billion-parameter language model from core-3, produced by merging SanjiWatsuki/Kunoichi-DPO-v2-7B and mlabonne/Monarch-7B with the slerp (spherical linear interpolation) merge method. It achieves an average score of 74.82 on the Open LLM Leaderboard, indicating strong general reasoning, and supports a 4096-token context length, making it suitable for tasks requiring robust language understanding and generation.
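The slerp merge mentioned above interpolates corresponding weight tensors of the two parent models along a great-circle arc rather than a straight line. As a minimal sketch (this is an illustration of the general technique, not the exact procedure used to build this model; the `slerp` helper and the interpolation factor are assumptions):

```python
import numpy as np

def slerp(t, v0, v1, eps=1e-8):
    """Spherical linear interpolation between two flattened weight tensors.

    t=0 returns v0, t=1 returns v1; intermediate t follows the arc
    between the two directions, preserving angular geometry.
    """
    v0_u = v0 / np.linalg.norm(v0)
    v1_u = v1 / np.linalg.norm(v1)
    # Angle between the two weight vectors.
    omega = np.arccos(np.clip(np.dot(v0_u, v1_u), -1.0, 1.0))
    if abs(omega) < eps:
        # Nearly parallel vectors: fall back to plain linear interpolation.
        return (1 - t) * v0 + t * v1
    sin_omega = np.sin(omega)
    return (np.sin((1 - t) * omega) / sin_omega) * v0 + \
           (np.sin(t * omega) / sin_omega) * v1

# Hypothetical per-tensor merge: blend each pair of parent tensors halfway.
theta_a = np.array([1.0, 0.0])
theta_b = np.array([0.0, 1.0])
merged = slerp(0.5, theta_a, theta_b)
```

In practice, tools such as mergekit apply this kind of interpolation tensor by tensor across the two checkpoints, often with layer-dependent interpolation factors.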
