Model Overview
Nitral-AI/Wayfarer_Eris_Noctis-12B is a 12-billion-parameter language model created by Nitral-AI by merging two base models: Nitral-AI/Captain_Eris_Noctis-12B-alt-v0.420 and LatitudeGames/Wayfarer-12B. The merge uses the SLERP method, with distinct interpolation weights applied to the self_attn and mlp layers to shape the resulting model's behavior.
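SLERP merges of this kind are typically produced with a tool such as mergekit; at their core they interpolate corresponding weight tensors along the arc between them rather than along a straight line. The sketch below only illustrates that operation; the interpolation factors shown are placeholders, not the published merge recipe.

```python
import torch

def slerp(t: float, a: torch.Tensor, b: torch.Tensor, eps: float = 1e-8) -> torch.Tensor:
    """Spherical linear interpolation between two weight tensors.

    t=0 returns `a`, t=1 returns `b`. Falls back to linear interpolation
    when the tensors are nearly colinear.
    """
    a_flat, b_flat = a.flatten().float(), b.flatten().float()
    a_unit = a_flat / (a_flat.norm() + eps)
    b_unit = b_flat / (b_flat.norm() + eps)
    dot = torch.clamp(torch.dot(a_unit, b_unit), -1.0, 1.0)
    omega = torch.arccos(dot)
    if omega.abs() < eps:                      # nearly colinear: plain lerp
        return (1.0 - t) * a + t * b
    sin_omega = torch.sin(omega)
    coef_a = torch.sin((1.0 - t) * omega) / sin_omega
    coef_b = torch.sin(t * omega) / sin_omega
    return (coef_a * a_flat + coef_b * b_flat).reshape(a.shape).to(a.dtype)

# Hypothetical illustration: separate factors for attention vs. MLP weights.
# t_self_attn, t_mlp = 0.4, 0.6   # placeholder values, not the actual recipe
```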
Key Features
- Merged Architecture: Combines the strengths of two distinct 12B models.
- SLERP Merge Method: Utilizes spherical linear interpolation for layer merging, with separate interpolation weights for the self_attn and mlp sub-layers.
- ChatML Support: Designed to work with the ChatML prompt format, making it suitable for instruction-following and conversational AI applications (see the prompt sketch after this list).
- Quantized Versions Available: GGUF and Exl2 (4bpw) quantizations are provided by community contributors for optimized inference.
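As a rough illustration of the ChatML format mentioned above, the snippet below builds a prompt with Hugging Face transformers. It assumes the repository ships a ChatML chat template for the tokenizer; the system and user messages are placeholders.

```python
from transformers import AutoTokenizer

MODEL_ID = "Nitral-AI/Wayfarer_Eris_Noctis-12B"

messages = [
    {"role": "system", "content": "You are a helpful assistant."},            # illustrative text
    {"role": "user", "content": "Describe the town the party just entered."}, # illustrative text
]

# If the tokenizer ships a ChatML chat template, this renders the prompt string.
tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
prompt = tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)

# Equivalent hand-built ChatML prompt, for reference:
# <|im_start|>system
# You are a helpful assistant.<|im_end|>
# <|im_start|>user
# Describe the town the party just entered.<|im_end|>
# <|im_start|>assistant
print(prompt)
```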
Intended Use Cases
This model is particularly well-suited for:
- Conversational AI: Its ChatML compatibility makes it ideal for chatbots, virtual assistants, and interactive dialogue systems.
- Role-playing and Creative Writing: The underlying merged models suggest potential for nuanced character interactions and imaginative text generation.
- Local Deployment: Availability of quantized versions (GGUF, Exl2) facilitates efficient deployment on consumer hardware.
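For local inference with a GGUF quantization, a minimal sketch using llama-cpp-python might look like the following. The GGUF filename, context length, and sampling settings are placeholders; the actual file comes from the community quantization release.

```python
from llama_cpp import Llama

# Placeholder path: substitute the GGUF file provided by the community quants.
llm = Llama(
    model_path="./Wayfarer_Eris_Noctis-12B-Q4_K_M.gguf",  # placeholder filename
    n_ctx=8192,          # adjust to your hardware and desired context length
    n_gpu_layers=-1,     # offload all layers to GPU if VRAM allows
    chat_format="chatml",
)

response = llm.create_chat_completion(
    messages=[
        {"role": "system", "content": "You are a creative game master."},
        {"role": "user", "content": "Set the scene for a midnight heist."},
    ],
    max_tokens=256,
    temperature=0.8,
)
print(response["choices"][0]["message"]["content"])
```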