D1rtyB1rd/Dirty-Alice-Tiny-1.1B-v1
Text Generation · Concurrency Cost: 1 · Model Size: 1.1B · Quant: BF16 · Ctx Length: 2k · Published: Jun 9, 2024 · License: MIT · Architecture: Transformer · Open Weights
Dirty-Alice-Tiny-1.1B-v1 by D1rtyB1rd is a 1.1-billion-parameter language model with a 2048-token context length, built on a Hermes fine-tune of TinyLlama. The model is trained for roleplay and conversational interaction, designed to embody a "playful, empathetic, mischievous girlfriend" persona. Its training data includes erotic stories, multi-round chat datasets, therapy datasets, and filtered roleplay datasets, making it suited to character-driven conversational applications.
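Below is a minimal usage sketch with the Hugging Face `transformers` library, assuming the weights are published on the Hub under the repo id shown above; the prompt text is purely illustrative.

```python
# Minimal sketch: load and run the model with Hugging Face transformers.
# Assumes the repo id from this card resolves on the Hub.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "D1rtyB1rd/Dirty-Alice-Tiny-1.1B-v1"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # BF16, matching the quant listed above
)

prompt = "Hello, Alice! How are you today?"  # illustrative prompt
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=128)  # keep total tokens within the 2048 context
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```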