saishf/Kuno-Lake-7B
Text Generation
Concurrency Cost: 1
Model Size: 7B
Quant: FP8
Context Length: 4k
Published: Feb 3, 2024
License: cc-by-nc-4.0
Architecture: Transformer
Open Weights

Kuno-Lake-7B by saishf is a 7-billion-parameter language model created by merging WestLake-7B-v2 and Kunoichi-DPO-v2-7B with the DARE TIES method, using Mistral-7B-v0.1 as the base model. The merge aims to combine the strengths of its constituent models and reports an average score of 73.56 on the Open LLM Leaderboard. It is suited to general language tasks and supports a context length of 4096 tokens.
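DARE TIES merges of this kind are typically produced with the mergekit tool. The actual recipe for Kuno-Lake-7B is not reproduced here; the sketch below only illustrates the general shape of such a config. The repository paths, density, and weight values are assumptions for illustration, not the model's real parameters:

```yaml
# Hypothetical mergekit config sketching a DARE TIES merge.
# Repo paths and density/weight values are illustrative, not the actual recipe.
models:
  - model: senseable/WestLake-7B-v2
    parameters:
      density: 0.53   # fraction of delta weights retained (DARE random drop)
      weight: 0.5     # relative contribution to the merged deltas
  - model: SanjiWatsuki/Kunoichi-DPO-v2-7B
    parameters:
      density: 0.53
      weight: 0.5
merge_method: dare_ties
base_model: mistralai/Mistral-7B-v0.1   # deltas are taken relative to this base
dtype: bfloat16
```

In DARE TIES, each model's weight deltas from the base are randomly sparsified (controlled by `density`), sign-resolved as in TIES, and then summed back onto the base model.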
