grimjim/kukulemon-7B
Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 4K · Published: Mar 11, 2024 · License: cc-by-nc-4.0 · Architecture: Transformer · Open Weights

grimjim/kukulemon-7B is a 7-billion-parameter language model created by grimjim through a SLERP merge that combines Kunoichi reasoning models with KatyTheCutie/LemonadeRP-4.5.3, a model focused on roleplay. The merge aims to pair strong reasoning with enhanced roleplaying performance. Although the model claims a 32K context length, informal testing suggests coherence holds only up to about 8K tokens; within that range it is suitable for applications that need both logical processing and creative conversational ability.
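SLERP (spherical linear interpolation) blends two models by interpolating each pair of corresponding weight tensors along the arc between them rather than along a straight line, which tends to preserve the weights' magnitude. The sketch below is a minimal, illustrative NumPy version of the interpolation formula applied to a single flattened tensor; it is not the actual implementation used to produce kukulemon-7B (merge tooling such as mergekit applies this per tensor with per-layer interpolation factors).

```python
import numpy as np

def slerp(t, v0, v1, eps=1e-8):
    """Spherically interpolate between two weight tensors.

    t=0 returns v0, t=1 returns v1; intermediate t follows the arc
    between the two (normalized) directions.
    """
    v0 = np.asarray(v0, dtype=np.float64)
    v1 = np.asarray(v1, dtype=np.float64)
    # Angle between the two tensors, treated as flat vectors
    dot = np.clip(
        np.dot(v0.ravel(), v1.ravel())
        / (np.linalg.norm(v0) * np.linalg.norm(v1)),
        -1.0, 1.0,
    )
    omega = np.arccos(dot)
    so = np.sin(omega)
    if abs(so) < eps:
        # Nearly parallel tensors: fall back to linear interpolation
        return (1.0 - t) * v0 + t * v1
    return (np.sin((1.0 - t) * omega) / so) * v0 + (np.sin(t * omega) / so) * v1

# Toy example: halfway between two orthogonal unit vectors
a = np.array([1.0, 0.0])
b = np.array([0.0, 1.0])
mid = slerp(0.5, a, b)  # lies on the unit circle between a and b
```

Unlike a plain weighted average, the halfway point between two orthogonal unit vectors keeps norm 1 here, which is the usual motivation for SLERP over linear merging.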
