huihui-ai/Mistral-Small-24B-Instruct-2501-abliterated
Text Generation · Concurrency Cost: 2 · Model Size: 24B · Quant: FP8 · Ctx Length: 32k · Published: Feb 1, 2025 · License: apache-2.0 · Architecture: Transformer · Open Weights
The huihui-ai/Mistral-Small-24B-Instruct-2501-abliterated model is a 24 billion parameter instruction-tuned causal language model derived from mistralai/Mistral-Small-24B-Instruct-2501. This model has been modified using an 'abliteration' technique to remove refusal behaviors, making it an uncensored variant. It is primarily designed for use cases requiring a large language model without built-in content refusal mechanisms.
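Since the model is served behind a hosted inference API, a typical way to call it is through an OpenAI-compatible chat-completions request. The sketch below only builds the JSON request body; the endpoint URL, API key, and exact parameter support are assumptions and would come from your provider's documentation.

```python
import json

# Minimal request-body sketch for an OpenAI-compatible chat-completions
# endpoint (hypothetical setup; endpoint URL and API key not shown).
payload = {
    "model": "huihui-ai/Mistral-Small-24B-Instruct-2501-abliterated",
    "messages": [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Explain what an instruction-tuned model is."},
    ],
    # Common sampler settings; supported names may vary by provider.
    "temperature": 0.7,
    "max_tokens": 256,
}

# Serialize for an HTTP POST to the provider's chat-completions route.
body = json.dumps(payload)
```

The serialized `body` would then be sent with an `Authorization: Bearer <key>` header to the provider's chat-completions route.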
Popular Sampler Settings
Top 3 parameter combinations used by Featherless users for this model.
- temperature
- top_p
- top_k
- frequency_penalty
- presence_penalty
- repetition_penalty
- min_p
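To make the parameters above concrete, here is a self-contained sketch of how the filtering-style samplers (temperature, top_k, top_p, min_p) and a repetition penalty are commonly combined; this is an illustrative pure-Python implementation of the standard techniques, not the exact logic used by any particular inference server.

```python
import math
import random

def apply_repetition_penalty(logits, generated_ids, penalty):
    """Standard repetition penalty: divide positive logits of already-seen
    tokens by `penalty`, multiply negative ones, discouraging repeats."""
    out = list(logits)
    for i in set(generated_ids):
        out[i] = out[i] / penalty if out[i] > 0 else out[i] * penalty
    return out

def sample_token(logits, temperature=1.0, top_k=0, top_p=1.0, min_p=0.0, rng=None):
    """Sample one token index from raw logits using common sampler settings.

    temperature: divides logits before softmax (lower = more deterministic)
    top_k:       keep only the k most probable tokens (0 disables)
    top_p:       keep the smallest prefix whose cumulative probability >= top_p
    min_p:       drop tokens whose probability < min_p * (top token's probability)
    """
    rng = rng or random.Random()
    # Temperature scaling, then a numerically stable softmax.
    scaled = [l / max(temperature, 1e-8) for l in logits]
    m = max(scaled)
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    probs = sorted(((i, e / total) for i, e in enumerate(exps)),
                   key=lambda t: -t[1])
    # top_k: truncate to the k most probable tokens.
    if top_k > 0:
        probs = probs[:top_k]
    # top_p (nucleus): keep the smallest high-probability prefix.
    if top_p < 1.0:
        kept, cum = [], 0.0
        for i, p in probs:
            kept.append((i, p))
            cum += p
            if cum >= top_p:
                break
        probs = kept
    # min_p: cutoff relative to the most likely surviving token.
    if min_p > 0.0:
        cutoff = min_p * probs[0][1]
        probs = [(i, p) for i, p in probs if p >= cutoff]
    # Renormalize what survived and sample from it.
    z = sum(p for _, p in probs)
    r = rng.random() * z
    for i, p in probs:
        r -= p
        if r <= 0:
            return i
    return probs[-1][0]
```

With `top_k=1` the function reduces to greedy decoding, which makes the behavior easy to check; the frequency and presence penalties listed above are analogous per-token logit adjustments (scaled by count and by presence, respectively) and are omitted here for brevity.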