Vortex5/Moondark-12B
Moondark-12B: A Merged Model for Roleplay
Moondark-12B is a 12-billion-parameter language model developed by Vortex5, built specifically for roleplay. It was constructed using the Model Stock merge method, a technique for combining the strengths of multiple fine-tuned variants of a common base model into a single checkpoint.
Merge Details
The model integrates components from several 12B parameter Mistral-Nemo-based models, including:
- flammenai/Mahou-1.5-mistral-nemo-12B
- Delta-Vector/Ohashi-NeMo-12B
- HumanLLMs/Human-Like-Mistral-Nemo-Instruct-2407
The merge used natong19/Mistral-Nemo-Instruct-2407-abliterated as its base model, with parameter normalization enabled and bfloat16 as the working dtype. This consolidates the linguistic and stylistic patterns of the constituent models into a single checkpoint aimed at rich, coherent roleplay scenarios and dialogue.
Key Characteristics
- Parameter Count: 12 billion.
- Context Length: 32768-token context window, enabling longer and more complex interactions (see the loading sketch after this list).
- Primary Application: Roleplay, narrative generation, and interactive storytelling.
When to Use This Model
Moondark-12B is particularly well-suited for applications requiring:
- Engaging Roleplay: Generating dynamic and consistent character interactions.
- Creative Writing: Assisting in the development of detailed narratives and dialogues.
- Interactive Storytelling: Powering chatbots or AI companions designed for immersive conversational experiences (see the generation sketch below).