coder3101/Big-Tiger-Gemma-27B-v3-heretic-v2
Text generation · Concurrency cost: 2 · Model size: 27B · Quant: FP8 · Context length: 32k · Published: Dec 21, 2025 · Vision · Architecture: Transformer

coder3101/Big-Tiger-Gemma-27B-v3-heretic-v2 is a 27-billion-parameter Gemma-based large language model with a 32,768-token context length, derived from TheDrummer's Big-Tiger-Gemma-27B-v3. This version has been decensored with the Heretic tool, giving it a more neutral tone, fewer refusals, and better steerability on harder themes. It is intended for applications that need less positivity bias and direct engagement with sensitive topics, while retaining the base model's potential vision capabilities.


Popular Sampler Settings

Top 3 parameter combinations used by Featherless users for this model.

- temperature
- top_p
- top_k
- frequency_penalty
- presence_penalty
- repetition_penalty
- min_p
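The parameters listed above map directly onto the request body of an OpenAI-compatible completions call, which is how such hosted models are typically queried. A minimal sketch of assembling such a request payload, assuming an OpenAI-style endpoint; the values shown are illustrative placeholders, not the actual popular configurations from Featherless users:

```python
import json

# Hypothetical sampler configuration for an OpenAI-compatible completions
# request. Endpoint URL, API key, and all numeric values are placeholder
# assumptions, not settings published on this model card.
payload = {
    "model": "coder3101/Big-Tiger-Gemma-27B-v3-heretic-v2",
    "prompt": "Write a short scene set on a night train.",
    "max_tokens": 256,
    # Sampler settings corresponding to the parameters listed above:
    "temperature": 0.8,         # randomness of token selection
    "top_p": 0.95,              # nucleus sampling cutoff
    "top_k": 40,                # consider only the top-k tokens
    "frequency_penalty": 0.0,   # penalize tokens by how often they appear
    "presence_penalty": 0.0,    # penalize tokens that have appeared at all
    "repetition_penalty": 1.05, # multiplicative repetition discouragement
    "min_p": 0.05,              # drop tokens below this fraction of the top prob
}

# The serialized payload would be POSTed to the provider's
# /v1/completions endpoint with an Authorization header.
print(json.dumps(payload, indent=2))
```

Any of the seven sampler fields can be omitted, in which case the server falls back to its defaults; the popular configurations shown in the tabs differ only in the values assigned to these keys.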