braindao/gemma-3-27b-it-uncensored

Hosted on Hugging Face

Vision · Concurrency Cost: 2 · Model Size: 27B · Quant: FP8 · Context Length: 32k · Published: Apr 9, 2025 · Architecture: Transformer

The braindao/gemma-3-27b-it-uncensored model is a 27-billion-parameter instruction-tuned language model with a 32,768-token context length. It is based on the Gemma architecture and designed for general language-generation tasks. Its primary differentiator is its uncensored nature, making it suitable for applications that require less restrictive content filtering, and it aims to provide broad, flexible utility across text-based applications.


Overview

The braindao/gemma-3-27b-it-uncensored model is an instruction-tuned language model built upon the Gemma architecture, featuring 27 billion parameters and a substantial 32,768-token context length. Specific development details, training data, and performance benchmarks are not provided in the available model card, but its "uncensored" designation suggests a design intent for broader content-generation latitude than models with stricter content moderation.

Key Characteristics

  • Model Size: 27 billion parameters, indicating a large capacity for complex language understanding and generation.
  • Context Length: A 32768 token context window allows for processing and generating longer, more coherent texts.
  • Architecture: Based on the Gemma family, known for its efficiency and performance.
  • Uncensored Nature: Implies fewer inherent content restrictions, offering flexibility for diverse applications.
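The 32,768-token context window is a hard budget shared between the prompt and the generated output. A minimal sketch of the budgeting check an application might perform (the token counts below are illustrative; a real application would count tokens with the model's tokenizer):

```python
# Minimal sketch: checking that a request fits the model's 32,768-token
# context window. Token counts are illustrative, not produced by the
# actual Gemma tokenizer.
CONTEXT_LENGTH = 32_768

def fits_in_context(prompt_tokens: int, max_new_tokens: int,
                    context_length: int = CONTEXT_LENGTH) -> bool:
    """Return True if the prompt plus the requested generation budget
    fits inside the model's context window."""
    return prompt_tokens + max_new_tokens <= context_length

# A 30,000-token prompt leaves room for at most 2,768 new tokens.
assert fits_in_context(30_000, 2_000)
assert not fits_in_context(30_000, 3_000)
```

In practice the check decides whether to truncate the prompt or lower the generation budget before sending the request.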

Potential Use Cases

Given its characteristics, this model could be suitable for:

  • Creative Writing: Generating diverse and unrestricted narratives, scripts, or poetry.
  • Open-ended Dialogue Systems: Developing chatbots or conversational agents that require less content filtering.
  • Research and Development: Exploring the capabilities of large language models without predefined content constraints.
  • Specialized Content Generation: Applications where standard content filters might be overly restrictive for the intended purpose.
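Hosted models like this one are typically reached through an OpenAI-compatible chat-completions API. The sketch below builds such a request body for this model; the endpoint path mentioned in the comment and the default parameter values are assumptions, so consult your provider's documentation for the exact details:

```python
import json

# Sketch of a chat-completion request body for an OpenAI-compatible
# endpoint. The endpoint path and default sampling values here are
# assumptions; check your provider's API reference.
def build_chat_request(user_message: str, *, temperature: float = 0.7,
                       top_p: float = 0.9, max_tokens: int = 512) -> str:
    payload = {
        "model": "braindao/gemma-3-27b-it-uncensored",
        "messages": [{"role": "user", "content": user_message}],
        "temperature": temperature,
        "top_p": top_p,
        "max_tokens": max_tokens,
    }
    return json.dumps(payload)

# The serialized body would be POSTed to the provider's
# chat-completions endpoint (commonly /v1/chat/completions)
# with an Authorization header carrying your API key.
body = build_chat_request("Write a short noir opening line.")
```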

Popular Sampler Settings

The top 3 parameter combinations used by Featherless users for this model cover the following sampler settings:

  • temperature
  • top_p
  • top_k
  • frequency_penalty
  • presence_penalty
  • repetition_penalty
  • min_p