mlabonne/gemma-3-27b-it-abliterated
Text generation · Concurrency cost: 2 · Model size: 27B · Quant: FP8 · Context length: 32k · Published: Mar 16, 2025 · License: Gemma · Vision · Architecture: Transformer · 0.3K warm

The mlabonne/gemma-3-27b-it-abliterated model is a 27-billion-parameter instruction-tuned causal language model based on Google's Gemma 3 architecture. It has been modified with an "abliteration" technique that reduces refusals and censorship, so the model declines fewer requests while preserving coherent output. It is intended for use cases that require less restrictive content generation. The model has a context length of 32768 tokens.
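Conceptually, abliteration identifies a "refusal direction" in the model's activations and projects it out, so the component of a hidden state aligned with refusal behavior is removed. A minimal sketch of that projection on a single vector (the function name is illustrative; the actual technique operates on transformer hidden states across many layers):

```python
import numpy as np

def ablate_direction(hidden: np.ndarray, direction: np.ndarray) -> np.ndarray:
    """Remove the component of `hidden` along `direction`:
    h' = h - (h . d_hat) * d_hat, leaving only the orthogonal part."""
    d_hat = direction / np.linalg.norm(direction)
    return hidden - np.dot(hidden, d_hat) * d_hat
```

After this projection, the result is orthogonal to the ablated direction, which is the sense in which the "refusal" component is eliminated.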


Popular Sampler Settings

The top 3 parameter combinations used by Featherless users for this model cover the following sampler parameters:

- temperature: scales the sharpness of the next-token probability distribution (lower is more deterministic)
- top_p: nucleus sampling; samples only from the smallest token set whose cumulative probability exceeds p
- top_k: restricts sampling to the k most likely tokens
- frequency_penalty: penalizes tokens in proportion to how often they have already appeared
- presence_penalty: applies a flat penalty to any token that has appeared at least once
- repetition_penalty: multiplicative penalty on previously generated tokens
- min_p: discards tokens whose probability falls below min_p times the top token's probability
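As a rough sketch, these sampler parameters can all be passed in the body of an OpenAI-compatible chat-completions request. The endpoint URL and every parameter value below are illustrative placeholders, not the popular configurations themselves (which are not reproduced here):

```python
import json
import urllib.request

# Assumed OpenAI-compatible endpoint; check the provider's docs for the real URL.
API_URL = "https://api.featherless.ai/v1/chat/completions"

def build_payload(prompt: str) -> dict:
    """Assemble a request body carrying every sampler knob listed above."""
    return {
        "model": "mlabonne/gemma-3-27b-it-abliterated",
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.8,          # placeholder values; tune per use case
        "top_p": 0.95,
        "top_k": 40,
        "frequency_penalty": 0.0,
        "presence_penalty": 0.0,
        "repetition_penalty": 1.05,
        "min_p": 0.05,
    }

def build_request(payload: dict, api_key: str) -> urllib.request.Request:
    """Construct (but do not send) the HTTP request for the payload."""
    return urllib.request.Request(
        API_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",
        },
    )
```

Sending the request with `urllib.request.urlopen` (or any HTTP client) would then stream back a standard chat-completion response.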