SanjiWatsuki/Kunoichi-7B
Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 4k · Published: Jan 4, 2024 · License: cc-by-nc-4.0 · Architecture: Transformer · Open Weights

SanjiWatsuki/Kunoichi-7B is a 7 billion parameter general-purpose language model developed by SanjiWatsuki, optimized for role-playing (RP) and general intelligence. The model is a SLERP (spherical linear interpolation) merge designed to enhance 'brain power' while maintaining strong RP capabilities, achieving an MT-Bench score of 8.14 and an EQ-Bench score of 44.32. It supports a standard 4096-token context window, with experimental support up to 16k tokens using an NTK RoPE alpha of 2.6.
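
The 16k extension works by raising the RoPE frequency base rather than retraining. Below is a minimal sketch of loading the model with transformers and applying the common NTK-aware conversion from alpha to a scaled base, theta' = theta * alpha^(d/(d-2)) with head dimension d; only the model ID and the 2.6 alpha come from this page, while the head dimension, default base, and 16k position limit are assumed from the standard Mistral-7B configuration.

```python
# Sketch: extend Kunoichi-7B's context via NTK RoPE scaling (alpha = 2.6).
# Assumes Mistral-7B defaults: head_dim = 128, rope base theta = 10,000.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "SanjiWatsuki/Kunoichi-7B"
HEAD_DIM = 128          # assumed Mistral-7B head dimension
BASE_THETA = 10_000.0   # assumed default RoPE base
NTK_ALPHA = 2.6         # alpha from the model description

# NTK-aware scaling: theta' = theta * alpha^(d / (d - 2)) -> ~26,400 here.
scaled_theta = BASE_THETA * NTK_ALPHA ** (HEAD_DIM / (HEAD_DIM - 2))

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForCausalLM.from_pretrained(
    MODEL_ID,
    torch_dtype=torch.bfloat16,
    rope_theta=scaled_theta,        # override the default RoPE base
    max_position_embeddings=16384,  # extend the position window to 16k
    device_map="auto",
)

prompt = "Write a short scene between two rival swordswomen."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```

Backends such as llama.cpp and ExLlama expose the same knob directly as an alpha value, so there the 2.6 can be passed without the conversion above.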
