ChaoticNeutrals/Cookie_7B
Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 4k · Published: Feb 17, 2024 · License: other · Architecture: Transformer
Cookie_7B is a 7-billion-parameter merged language model developed by ChaoticNeutrals, built on jeiku/SpaghettiOs_7B using the DARE TIES merge method. The model aims for balanced performance, combining general assistant functionality with enhanced roleplaying and romantic interaction capabilities. It achieves an average score of 71.87 on the Open LLM Leaderboard, indicating solid logical reasoning and general language understanding for its size.
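A DARE TIES merge like the one described above is typically produced with a mergekit configuration. The card does not list which models were merged into the base, so the sketch below uses placeholder model names, and the `density` and `weight` values are purely illustrative, not the settings actually used for Cookie_7B:

```yaml
# Hypothetical mergekit config illustrating a DARE TIES merge
# onto the stated base model. Merge sources and parameters are
# placeholders; the actual recipe for Cookie_7B is not published here.
merge_method: dare_ties
base_model: jeiku/SpaghettiOs_7B
models:
  - model: example-org/roleplay-model-7B   # placeholder merge source
    parameters:
      density: 0.5    # fraction of delta weights kept (illustrative)
      weight: 0.5     # contribution to the merged model (illustrative)
dtype: bfloat16
```

In a DARE TIES merge, each source model's delta from the base is randomly sparsified (DARE) and sign-consistent parameters are combined (TIES), which lets multiple fine-tunes be folded into one base model with limited interference.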