Aratako/MistralPrism-24B
Text generation · Model size: 24B · Quant: FP8 · Context length: 32k · Published: Jun 8, 2025 · License: MIT · Architecture: Transformer · Concurrency cost: 2 · Open weights

Aratako/MistralPrism-24B is a 24-billion-parameter language model developed by Aratako, based on mistralai/Mistral-Small-3.1-24B-Instruct-2503. It was created by merging multiple overseas models and is tuned for role-playing scenarios, with a focus on detailed character role-play and interactive dialogue. It supports a context length of 32,768 tokens.
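As a minimal sketch of how a role-play conversation with this model might look, the snippet below builds an OpenAI-compatible chat-completions request. The base URL, API key, and `send` helper are placeholders (assumptions, not part of this listing); adapt them to whichever host actually serves the model.

```python
# Hedged sketch: querying Aratako/MistralPrism-24B via an OpenAI-compatible
# chat-completions endpoint. Endpoint URL and API key are placeholders.
import json
import urllib.request

MODEL_ID = "Aratako/MistralPrism-24B"
CTX_LENGTH = 32768  # advertised context window (32k tokens)


def build_chat_request(messages, max_tokens=512, temperature=0.7):
    """Build a chat-completions payload for a role-play conversation."""
    return {
        "model": MODEL_ID,
        "messages": messages,
        "max_tokens": max_tokens,
        "temperature": temperature,
    }


def send(payload, base_url, api_key):
    """POST the payload to an OpenAI-compatible endpoint (placeholder URL)."""
    req = urllib.request.Request(
        f"{base_url}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)


# Example role-play conversation: a system prompt defines the character,
# and user turns drive the interactive dialogue.
messages = [
    {"role": "system",
     "content": "You are Aria, a stoic knight-captain. Stay in character."},
    {"role": "user",
     "content": "Captain, the gates are breached. What are your orders?"},
]
payload = build_chat_request(messages)
```

The response would then be fetched with `send(payload, base_url, api_key)` against the serving host's endpoint.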
