TheDrummer/Big-Tiger-Gemma-27B-v1

Hugging Face
Text Generation · Model size: 27B · Quant: FP8 · Context length: 32k · Concurrency cost: 2 · Published: Jul 14, 2024 · Architecture: Transformer

TheDrummer/Big-Tiger-Gemma-27B-v1 is a 27-billion-parameter language model based on the Gemma architecture, developed by TheDrummer. The model is notable for its "decensored" tuning: it produces responses without refusals, making it suitable for applications that require unfiltered output. With a context length of 32,768 tokens, it supports extended conversations and long-document processing.


Overview

TheDrummer/Big-Tiger-Gemma-27B-v1 is a 27-billion-parameter language model built on the Gemma architecture. Its primary distinguishing feature is its "decensored" character: it has been modified to respond without the refusals or content filtering typical of other models, which makes it useful wherever unconstrained, direct output is desired.

Key Characteristics

  • Architecture: Based on the Gemma model family.
  • Parameter Count: 27 billion parameters, offering a balance of capability and computational demand.
  • Context Length: Supports a substantial context window of 32768 tokens, enabling processing of longer texts and maintaining coherence over extended interactions.
  • Decensored Output: Explicitly designed to avoid refusals, providing unfiltered responses.
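The figures above (27B parameters, FP8 quantization, 32k context) let you sketch a rough serving-memory budget. The arithmetic below is illustrative only: the layer count, KV-head count, and head dimension are assumptions based on published Gemma-2-27B specifications, not values confirmed by this page.

```python
# Rough memory-footprint estimate for serving this model at FP8.
PARAMS = 27e9          # 27B parameters (from the model card)
BYTES_PER_PARAM = 1    # FP8 quantization ~ 1 byte per weight

weight_gb = PARAMS * BYTES_PER_PARAM / 1e9

# Per-token KV-cache cost: K and V tensors, per layer, per KV head.
# ASSUMED Gemma-2-27B dimensions; verify against the actual config.
LAYERS, KV_HEADS, HEAD_DIM = 46, 16, 128
kv_bytes_per_token = 2 * LAYERS * KV_HEADS * HEAD_DIM * BYTES_PER_PARAM

CTX = 32_768           # context length from the model card
kv_gb = CTX * kv_bytes_per_token / 1e9

print(f"weights ~ {weight_gb:.0f} GB, KV cache at 32k ctx ~ {kv_gb:.1f} GB")
```

Under these assumptions the weights alone need on the order of 27 GB, and a single full 32k-token context adds several more GB of KV cache, which is why the catalog lists a nontrivial concurrency cost.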

Use Cases

This model is particularly well-suited for applications where:

  • Unfiltered Content Generation: Direct and uncensored responses are required, without built-in content moderation.
  • Creative Writing: Generating diverse and unrestricted narratives or dialogues.
  • Research and Development: Exploring the boundaries of language model behavior without content constraints.
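For the conversational use cases above, prompts to Gemma-family models conventionally follow a turn-based format with `<start_of_turn>` / `<end_of_turn>` markers. The helper below is a minimal sketch of building such a prompt by hand; it assumes this finetune keeps the base Gemma template, which you should confirm against the tokenizer's chat template.

```python
def build_gemma_prompt(turns):
    """Build a Gemma-style chat prompt from (role, text) pairs.

    ASSUMPTION: this finetune keeps the base Gemma turn format;
    check the model's tokenizer chat template to confirm.
    """
    parts = []
    for role, text in turns:
        parts.append(f"<start_of_turn>{role}\n{text}<end_of_turn>\n")
    parts.append("<start_of_turn>model\n")  # cue the model's reply
    return "".join(parts)

prompt = build_gemma_prompt([("user", "Write a two-line poem about tigers.")])
print(prompt)
```

In practice, `tokenizer.apply_chat_template` in Hugging Face Transformers produces this formatting for you from a list of message dicts.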

Available Formats

In addition to the original model, Big-Tiger-Gemma-27B-v1 is available in several optimized formats for different deployment needs: