Aryanne/MixSwap
Text generation · Model size: 7B · Quantization: FP8 · Context length: 4K · Concurrency cost: 1 · Published: Mar 19, 2024 · License: apache-2.0 · Architecture: Transformer · Open weights
Aryanne/MixSwap is a 7-billion-parameter language model created by Aryanne using the task_swapping merge method. It combines cognitivecomputations/dolphin-2.2.1-mistral-7b, teknium/Mistral-Trismegistus-7B, and l3utterfly/mistral-7b-v0.1-layla-v4-chatml, using Aryanne/Open-StarLake-Swap-7B as the base model. The model is optimized for generating verbose, descriptive role-play conversations, making it well suited to interactive storytelling applications.
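Since one of the merged models (l3utterfly/mistral-7b-v0.1-layla-v4-chatml) is tuned on the ChatML format, prompts for the merged model are likely expected in ChatML as well. The sketch below shows how a conversation can be assembled into a ChatML prompt string; the function name and example messages are illustrative, not part of the model's documentation.

```python
def build_chatml_prompt(messages):
    """Format a list of {"role": ..., "content": ...} dicts as a ChatML prompt.

    ChatML wraps each turn in <|im_start|>role ... <|im_end|> markers and
    leaves an open assistant turn at the end for the model to complete.
    """
    parts = [
        f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>"
        for m in messages
    ]
    # Open an assistant turn so generation continues from here.
    parts.append("<|im_start|>assistant\n")
    return "\n".join(parts)


prompt = build_chatml_prompt([
    {"role": "system", "content": "You are a vivid, descriptive role-play narrator."},
    {"role": "user", "content": "Describe the tavern as I step inside."},
])
print(prompt)
```

The resulting string can be passed directly to any text-generation endpoint serving the model.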