braindao/gemma-3-27b-it-uncensored
Capabilities: Vision
Concurrency Cost: 2
Model Size: 27B
Quant: FP8
Ctx Length: 32k
Published: Apr 9, 2025
Architecture: Transformer
The braindao/gemma-3-27b-it-uncensored model is a 27-billion-parameter instruction-tuned language model with a 32,768-token (32k) context length. It is based on the Gemma 3 architecture and is designed for general language generation tasks. Its primary differentiator is its uncensored nature: content filtering is less restrictive than in the base instruction-tuned model, making it suitable for applications that require fewer refusals. It aims to provide flexible, broad utility across text-based applications.
Popular Sampler Settings
The three most common parameter combinations used by Featherless users for this model.
temperature: controls the randomness of token sampling (higher values produce more varied output)
top_p: nucleus sampling; restricts sampling to the smallest set of tokens whose cumulative probability exceeds p
top_k: restricts sampling to the k most probable tokens
frequency_penalty: penalizes tokens in proportion to how often they have already appeared
presence_penalty: penalizes any token that has appeared at least once
repetition_penalty: multiplicative penalty applied to previously generated tokens
min_p: discards tokens whose probability falls below a fraction of the most likely token's probability
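As a sketch of how these settings are typically passed to an OpenAI-compatible completions endpoint, the snippet below builds a request payload that merges a sampler configuration with the model name and prompt. The numeric values are illustrative placeholders, not the actual Featherless user statistics, and `build_request` is a hypothetical helper, not part of any official client library.

```python
# Illustrative sampler configuration; these values are assumptions,
# not the top settings reported on the model page.
sampler_config = {
    "temperature": 0.8,
    "top_p": 0.95,
    "top_k": 40,
    "frequency_penalty": 0.0,
    "presence_penalty": 0.0,
    "repetition_penalty": 1.05,
    "min_p": 0.05,
}

def build_request(prompt: str, config: dict) -> dict:
    """Merge sampler settings into a chat-completions style payload."""
    return {
        "model": "braindao/gemma-3-27b-it-uncensored",
        "messages": [{"role": "user", "content": prompt}],
        **config,  # sampler keys are passed at the top level of the request
    }

payload = build_request("Write a short story about a lighthouse.", sampler_config)
```

The resulting dictionary can be serialized to JSON and POSTed to a chat-completions endpoint; servers that do not recognize extended keys such as `repetition_penalty` or `min_p` will usually ignore them.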