Epiculous/Mika-7B
Mika-7B Overview
Mika-7B is a 7 billion parameter language model developed by Epiculous, designed for conversational and roleplaying applications. Its training incorporates synthetic roleplay data generated by Claude 3 Opus, mirroring the methodology used for the Fett-uccine model, and it features an 8192-token context window, making it well suited to extended interactive sessions.
Key Capabilities
- Roleplay Optimization: Trained with Claude-generated synthetic roleplay data, making it adept at maintaining character and narrative consistency.
- Extended Context: Features an 8192-token context window, supporting longer and more complex conversational exchanges.
- Flexible Templating: Performs best with the ChatML context template or the Mistral Instruct template, giving developers flexibility in integration.
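To illustrate the ChatML option above, here is a minimal sketch of how a client might render a conversation into a ChatML prompt. It assumes the model follows the standard ChatML delimiters (`<|im_start|>`, `<|im_end|>`); the character name and messages are illustrative, and in practice you should verify the exact template against the model's tokenizer configuration.

```python
# Sketch: rendering a conversation as a ChatML prompt for Mika-7B.
# Assumes standard ChatML delimiters; verify against the model's
# tokenizer config before relying on this format.

def build_chatml_prompt(messages):
    """Render a list of {role, content} dicts as a ChatML string,
    ending with an open assistant turn for the model to complete."""
    parts = []
    for msg in messages:
        parts.append(f"<|im_start|>{msg['role']}\n{msg['content']}<|im_end|>\n")
    parts.append("<|im_start|>assistant\n")
    return "".join(parts)

# Hypothetical roleplay setup for demonstration only.
prompt = build_chatml_prompt([
    {"role": "system", "content": "You are Mika, a cheerful companion."},
    {"role": "user", "content": "Tell me about your day."},
])
print(prompt)
```

The trailing open `assistant` turn is what cues the model to generate its reply; the client then stops generation at the next `<|im_end|>` token.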
Good For
- Interactive Storytelling: Ideal for applications requiring dynamic and engaging narrative generation.
- Character Simulation: Suitable for creating virtual assistants or characters that can maintain consistent personas.
- Conversational AI: Effective in scenarios demanding natural and extended dialogue capabilities.
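For the extended-dialogue use cases above, a client still has to keep the running conversation inside the 8192-token context window. The sketch below shows one common approach, dropping the oldest turns first; the reply headroom value and the whitespace-based token counter are assumptions for illustration, and a real client would count tokens with the model's actual tokenizer.

```python
# Sketch: keeping a long roleplay history within Mika-7B's 8192-token
# context window by dropping the oldest turns first.

CONTEXT_LIMIT = 8192
RESERVED_FOR_REPLY = 512  # headroom for the model's response (assumed value)

def count_tokens(text):
    # Crude placeholder: roughly one token per whitespace-separated word.
    # A real client would use the model's tokenizer instead.
    return len(text.split())

def trim_history(system_prompt, turns):
    """Drop the oldest turns until the system prompt plus history fit
    the token budget. `turns` is a list of message strings, oldest first."""
    budget = CONTEXT_LIMIT - RESERVED_FOR_REPLY - count_tokens(system_prompt)
    kept, used = [], 0
    for turn in reversed(turns):  # walk newest-to-oldest, keeping recent turns
        cost = count_tokens(turn)
        if used + cost > budget:
            break
        kept.append(turn)
        used += cost
    return list(reversed(kept))  # restore oldest-first order
```

Trimming from the oldest end preserves the most recent exchanges, which matter most for keeping a character's persona and the current scene consistent.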