zerofata/MS3.2-PaintedFantasy-24B
Text generation · Model size: 24B · Quant: FP8 · Ctx length: 32k · Concurrency cost: 2 · Published: Jun 24, 2025 · License: apache-2.0 · Architecture: Transformer · Open weights

zerofata/MS3.2-PaintedFantasy-24B is an experimental 24-billion-parameter model based on Mistral Small 3.2, with a 32,768-token context length, fine-tuned specifically for character-driven roleplay (RP) and erotic roleplay (ERP). This uncensored model is designed to produce longer, narrative-heavy responses that portray characters accurately and proactively. Its training pipeline is unusual: continued pretraining on light novels and Frieren wiki data, followed by SFT and two stages of DPO to improve consistency and reduce 'Mistral-isms'.


Popular Sampler Settings

The three parameter combinations most used by Featherless users for this model cover the following sampler settings:

- temperature
- top_p
- top_k
- frequency_penalty
- presence_penalty
- repetition_penalty
- min_p
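
As a sketch of how these sampler parameters might be assembled into a request for an OpenAI-compatible completions endpoint: the example values below are illustrative assumptions, not a preset published for this model.

```python
import json

# Hypothetical sampler configuration; values are illustrative only,
# not a recommended preset for MS3.2-PaintedFantasy-24B.
sampler_config = {
    "temperature": 0.8,
    "top_p": 0.95,
    "top_k": 40,
    "frequency_penalty": 0.0,
    "presence_penalty": 0.0,
    "repetition_penalty": 1.05,
    "min_p": 0.05,
}

def build_completion_payload(prompt: str, max_tokens: int = 512) -> dict:
    """Merge the sampler settings into a completion request body."""
    payload = {
        "model": "zerofata/MS3.2-PaintedFantasy-24B",
        "prompt": prompt,
        "max_tokens": max_tokens,
    }
    payload.update(sampler_config)
    return payload

payload = build_completion_payload("Describe the tavern as the party enters.")
print(json.dumps(payload, indent=2))
# The payload would then be POSTed to the provider's completions endpoint.
```

Providers differ on which of these fields they accept (e.g. `min_p` and `repetition_penalty` are extensions beyond the base OpenAI schema), so unsupported keys may need to be dropped before sending.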