Vortex5/Fallen-Skies-12B
Vortex5/Fallen-Skies-12B is a 12-billion-parameter language model by Vortex5, built by merging Hollow-Aether-12B, MN-Slush, LinearWriter-12B, and Violet-Mist-12B with a custom `synforge` method. It is tuned for creative storytelling and roleplay, using the merged architecture to strengthen narrative generation and character interaction over a 32768-token context window.
Overview
Vortex5/Fallen-Skies-12B is a 12-billion-parameter language model developed by Vortex5. It combines four models: Hollow-Aether-12B, MN-Slush, LinearWriter-12B, and Violet-Mist-12B. The merge was executed with a custom `synforge` method using a strength of 0.9 and a consensus of 0.25, computed in bfloat16. The tokenizer is taken from Hollow-Aether-12B.
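Expressed as a mergekit-style recipe, the configuration above might look roughly like the following sketch. Note that `synforge` is Vortex5's custom method, so everything beyond the values stated above (the four source models, strength 0.9, consensus 0.25, bfloat16, and the tokenizer source) is illustrative, and the exact field names may differ:

```yaml
# Hypothetical recipe; `synforge` is a custom merge method,
# so field names here are illustrative, not the actual config.
merge_method: synforge
models:
  - model: Hollow-Aether-12B
  - model: MN-Slush
  - model: LinearWriter-12B
  - model: Violet-Mist-12B
parameters:
  strength: 0.9      # stated merge strength
  consensus: 0.25    # stated consensus threshold
dtype: bfloat16
tokenizer_source: Hollow-Aether-12B
```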
Key Capabilities
- Merged Architecture: Blends the strengths of four distinct 12B models into a single checkpoint specialized for prose.
- Optimized for Narrative: Tuned primarily for generating creative content rather than general-purpose tasks.
Intended Use Cases
- Storytelling: Excels at generating coherent and engaging narratives.
- Roleplay: Highly suitable for interactive role-playing scenarios, maintaining character consistency and dialogue flow.
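For roleplay use, turns are typically assembled into a chat-formatted prompt before generation. The sketch below builds a ChatML-style prompt by hand; the template itself is an assumption (verify the actual format against the model's tokenizer config or use the tokenizer's `apply_chat_template`), and the system text is just an example:

```python
def build_prompt(system: str, turns: list[tuple[str, str]]) -> str:
    """Assemble a ChatML-style prompt string.

    The ChatML tokens used here are an assumption; check the model's
    tokenizer_config.json for the template it actually expects.
    """
    parts = [f"<|im_start|>system\n{system}<|im_end|>"]
    for role, text in turns:
        parts.append(f"<|im_start|>{role}\n{text}<|im_end|>")
    # Leave the assistant turn open so the model continues it.
    parts.append("<|im_start|>assistant\n")
    return "\n".join(parts)

prompt = build_prompt(
    "You are the narrator of a dark-fantasy story.",  # example persona
    [("user", "Describe the fallen skies above the ruined city.")],
)
```

The resulting string can then be tokenized and passed to the model for generation, with the assistant turn left open for the model to complete.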