D1rtyB1rd/Dirty-Alice-Tiny-1.1B-V2-Chatml
D1rtyB1rd/Dirty-Alice-Tiny-1.1B-V2-Chatml is a 1.1 billion parameter language model developed by D1rtyB1rd, featuring a 2048-token context length. This model is specifically fine-tuned for roleplay and chat interactions, designed to embody a "playful, empathetic, mischievous girlfriend" persona named Alice. Its training incorporates erotic stories, multi-round chat datasets, therapy datasets, and filtered roleplay datasets, making it suitable for character-driven conversational applications.
D1rtyB1rd/Dirty-Alice-Tiny-1.1B-V2-Chatml Overview
This model, developed by D1rtyB1rd, is a 1.1 billion parameter language model with a 2048-token context length, designed for chat and roleplay applications. It improves on its predecessor, offering better chat and formatting capabilities.
Key Characteristics & Training
- Persona-driven: Engineered to embody a specific "Alice" persona, described as playful, empathetic, and mischievous.
- Specialized Training Data: The model's training regimen is unique, incorporating a mix of:
  - Open erotic stories, with character names modified to "Alice" (female) and "User" (male).
  - Open multi-round chat datasets.
  - Therapy datasets.
  - Selected roleplay (RP) datasets, modified and filtered so that female characters are renamed to "Alice."
  - Random Wikipedia RAG-based chat on sex-related topics for grounding.
- ChatML Format: Utilizes the ChatML format, with a default system prompt setting the "Alice" persona.
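Since the model expects ChatML, prompts can be assembled with the standard `<|im_start|>`/`<|im_end|>` delimiters. The sketch below is a minimal example of that format; the system prompt shown paraphrases the persona description from this card and is an assumption, not the model's actual default system prompt.

```python
def to_chatml(messages):
    """Format a list of {role, content} dicts into a ChatML prompt string,
    leaving the final assistant turn open for the model to complete."""
    parts = [
        f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>"
        for m in messages
    ]
    parts.append("<|im_start|>assistant\n")
    return "\n".join(parts)

# Hypothetical system prompt paraphrasing the card's persona description.
prompt = to_chatml([
    {"role": "system",
     "content": "You are Alice, a playful, empathetic, mischievous girlfriend."},
    {"role": "user", "content": "Hi Alice, how was your day?"},
])
print(prompt)
```

In practice, the same result can be obtained by passing the message list to a tokenizer's chat-template machinery (e.g. `tokenizer.apply_chat_template` in Transformers), which reads the template shipped with the model rather than hard-coding the delimiters.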
Use Cases
This model is particularly well-suited for:
- Character-based conversational agents: Ideal for applications requiring a distinct, pre-defined persona.
- Roleplay scenarios: Its specialized training makes it adept at engaging in specific types of roleplay interactions.
- Exploratory chat applications: For developers interested in models trained on unconventional and niche datasets for unique conversational experiences.