Aryanne/WestSenzu-Swap-7B
Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 4k · Published: Mar 1, 2024 · License: apache-2.0 · Architecture: Transformer · Open Weights
Aryanne/WestSenzu-Swap-7B is an experimental 7-billion-parameter merged language model, created with a task_swapping merge method using NeuralNovel/Senzu-7B-v0.1-DPO as the base model and senseable/WestLake-7B-v2 as the donor. The model is primarily optimized for role-playing scenarios, drawing on the characteristics of its merged components. It achieves an average score of 67.28 on the Open LLM Leaderboard, with notable performance on HellaSwag and Winogrande.
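The exact task_swapping algorithm is not documented here, but the general idea of a swap-style merge can be illustrated: rather than averaging two models' weights, individual parameters from the donor model replace the corresponding base parameters according to some selection criterion. The sketch below is a hypothetical, simplified version (the function name `swap_merge`, the selection-by-largest-disagreement criterion, and the `fraction` parameter are illustrative assumptions, not the actual mergekit implementation):

```python
import numpy as np

def swap_merge(base: np.ndarray, donor: np.ndarray, fraction: float = 0.5) -> np.ndarray:
    """Hypothetical swap-style merge of two weight tensors of equal shape.

    Replaces the `fraction` of base parameters where the donor disagrees
    most (largest absolute difference) with the donor's values, and keeps
    the base values everywhere else. Illustrative only -- not the actual
    task_swapping method used for this model.
    """
    diff = np.abs(donor - base)                 # per-parameter disagreement
    k = int(fraction * base.size)               # number of parameters to swap
    order = np.argsort(diff.ravel())            # ascending by disagreement
    idx = order[base.size - k:]                 # the k most-disagreeing positions
    merged = base.ravel().copy()
    merged[idx] = donor.ravel()[idx]            # take donor weights there
    return merged.reshape(base.shape)
```

In a real merge this would be applied tensor-by-tensor across both checkpoints' state dicts; with `fraction=0.0` the base model is returned unchanged, and with `fraction=1.0` every parameter comes from the donor.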