Aryanne/Open-StarLake-Swap-7B
Text generation · Model size: 7B · Quant: FP8 · Context length: 4K · Published: Mar 18, 2024 · License: apache-2.0 · Architecture: Transformer
Aryanne/Open-StarLake-Swap-7B is a 7-billion-parameter language model created by Aryanne on top of the senseable/WestLake-7B-v2 base model. It was produced with the task_swapping merge method, combining the strengths of berkeley-nest/Starling-LM-7B-alpha, NousResearch/Nous-Hermes-2-Mistral-7B-DPO, and openchat/openchat-3.5-0106. The model is intended for conversational role-play and uses a specified prompt format to generate verbose, descriptive interactions.
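To give a rough intuition for what a merge like this does, the sketch below illustrates the general task-vector idea behind model merging (in the style of "task arithmetic"): each fine-tune's delta from the base is extracted and recombined onto the base weights. This is a toy NumPy illustration with stand-in arrays, not the actual task_swapping algorithm used for this model, whose details may differ.

```python
import numpy as np

# Stand-in weight vectors; a real merge operates per-tensor on
# full model checkpoints.
base = np.array([1.0, 2.0, 3.0])   # base model weights
ft_a = np.array([1.5, 2.0, 2.5])   # fine-tune A
ft_b = np.array([1.0, 3.0, 3.0])   # fine-tune B

# A task vector is the delta a fine-tune applies to the base.
tau_a = ft_a - base
tau_b = ft_b - base

# Merge by adding scaled task vectors back onto the base.
# The 0.5 weights are illustrative, not the model's actual settings.
merged = base + 0.5 * tau_a + 0.5 * tau_b
print(merged)  # [1.25 2.5  2.75]
```

The actual merge combined three fine-tunes rather than two, and task_swapping selects or swaps deltas rather than simply averaging them, but the base-plus-delta structure is the same.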