coder3101/gemma-3-27b-it-heretic
Vision: yes | Concurrency cost: 2 | Model size: 27B | Quant: FP8 | Context length: 32k | Published: Nov 23, 2025 | License: Gemma | Architecture: Transformer

coder3101/gemma-3-27b-it-heretic is a 27-billion-parameter instruction-tuned multimodal language model derived from Google DeepMind's Gemma 3 family, with a 32,768-token context window. It is a decensored variant of google/gemma-3-27b-it created with the Heretic tool. The model handles text generation and image understanding, and it refuses fewer requests than the original, making it suitable for use cases that require less content moderation.
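As a sketch of how such a hosted model is typically invoked, the snippet below builds a chat-completions request payload in the common OpenAI-compatible format, including an image part for the model's vision capability. The endpoint URL, the image URL, and the exact request shape accepted by Featherless are assumptions, not taken from this card; only the payload is constructed here, no network call is made.

```python
import json

# Hypothetical request body for an OpenAI-compatible chat completions API.
# The model id comes from this card; everything else is a placeholder.
payload = {
    "model": "coder3101/gemma-3-27b-it-heretic",
    "messages": [
        {
            "role": "user",
            "content": [
                {"type": "text", "text": "Describe this image in one sentence."},
                # Placeholder image URL -- replace with a real image or a
                # base64 data URL, depending on what the provider accepts.
                {"type": "image_url", "image_url": {"url": "https://example.com/cat.png"}},
            ],
        }
    ],
    "max_tokens": 256,
}

# Serialize to JSON, as it would be sent in the HTTP request body.
body = json.dumps(payload)
print(len(body) > 0)
```

To actually send the request you would POST `body` to the provider's chat-completions endpoint with your API key in the `Authorization` header.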


Popular Sampler Settings

The three most popular parameter combinations used by Featherless users for this model cover the following sampler parameters:

temperature
top_p
top_k
frequency_penalty
presence_penalty
repetition_penalty
min_p
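The sampler parameters above can be collected into a single generation config. The values below are illustrative placeholders, not the actual community presets (those are not reproduced in this card's text):

```python
# Hypothetical sampler configuration covering the parameters listed above.
# Every value here is a placeholder, not a Featherless user preset.
sampler_config = {
    "temperature": 0.8,         # randomness of token selection
    "top_p": 0.95,              # nucleus sampling cutoff
    "top_k": 40,                # restrict to the k most likely tokens
    "frequency_penalty": 0.0,   # penalize tokens by how often they appeared
    "presence_penalty": 0.0,    # penalize tokens that appeared at all
    "repetition_penalty": 1.05, # multiplicative penalty on repeated tokens
    "min_p": 0.05,              # drop tokens below this fraction of the top probability
}

# Such a dict is usually merged into the request payload alongside
# "model" and "messages" when calling the API.
print(sorted(sampler_config))
```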