allura-org/Mistral-Small-Sisyphus-24b-2503
Text Generation · Concurrency Cost: 2 · Model Size: 24B · Quant: FP8 · Ctx Length: 32K · Published: Mar 13, 2025 · License: apache-2.0 · Architecture: Transformer · Open Weights · Warm

Mistral-Small-Sisyphus-24b-2503 by allura-org is a 24-billion-parameter language model with a 32K context length, fine-tuned for multi-turn roleplay and with support for reasoning blocks. The model produces coherent responses across a range of temperatures and is designed for interactive conversational applications. It uses the v7-Tekken instruct template, making it suitable for structured, engaging dialogue generation. Its primary strength is maintaining consistent characterization and narrative in roleplaying scenarios.
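As a sketch of what the v7-Tekken instruct template looks like in practice, the helper below assembles a prompt string. The exact tag layout (`[SYSTEM_PROMPT]`, `[INST]`, spacing, and BOS/EOS handling) is an assumption based on common descriptions of the Mistral V7-Tekken format, not taken from this page; verify against the model's tokenizer config before use.

```python
def format_v7_tekken(system: str, turns: list[tuple[str, str]]) -> str:
    """Assemble a multi-turn prompt in (assumed) v7-Tekken layout.

    `turns` is a list of (user_message, assistant_reply) pairs; the final
    pair may have an empty assistant_reply to leave room for generation.
    Tag names and placement here are an assumption, not confirmed by the
    model card.
    """
    parts = ["<s>"]  # BOS token, typically added by the tokenizer itself
    if system:
        parts.append(f"[SYSTEM_PROMPT]{system}[/SYSTEM_PROMPT]")
    for user_msg, assistant_msg in turns:
        parts.append(f"[INST]{user_msg}[/INST]")
        if assistant_msg:
            parts.append(f"{assistant_msg}</s>")  # EOS closes each reply
    return "".join(parts)
```

In a roleplay setting, the system block would typically carry the character card, with each `[INST]` block holding one user turn.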


Popular Sampler Settings

The three most popular parameter combinations used by Featherless users for this model. Each config tunes the following samplers:

- temperature
- top_p
- top_k
- frequency_penalty
- presence_penalty
- repetition_penalty
- min_p
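To show how these samplers fit together in a request, the sketch below builds a JSON body for an OpenAI-compatible completions endpoint (which Featherless exposes). The default values are illustrative placeholders only; they are not the actual "popular" configs from this page, which are shown in the interactive tabs.

```python
def build_sampler_payload(
    prompt: str,
    model: str = "allura-org/Mistral-Small-Sisyphus-24b-2503",
    temperature: float = 0.8,       # randomness of token selection
    top_p: float = 0.95,            # nucleus sampling cutoff
    top_k: int = 40,                # restrict to the k most likely tokens
    frequency_penalty: float = 0.0, # penalize tokens by occurrence count
    presence_penalty: float = 0.0,  # penalize tokens that appeared at all
    repetition_penalty: float = 1.05,  # multiplicative repetition damping
    min_p: float = 0.05,            # drop tokens below min_p * max prob
) -> dict:
    """Assemble a request body with all seven sampler parameters.

    Values are placeholder assumptions, not the model's recommended
    settings.
    """
    return {
        "model": model,
        "prompt": prompt,
        "temperature": temperature,
        "top_p": top_p,
        "top_k": top_k,
        "frequency_penalty": frequency_penalty,
        "presence_penalty": presence_penalty,
        "repetition_penalty": repetition_penalty,
        "min_p": min_p,
    }
```

The resulting dict can be POSTed as JSON to the provider's `/v1/completions` route with any HTTP client.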