p-e-w/gemma-3-12b-it-heretic
Text generation · Vision · Concurrency cost: 1 · Model size: 12B · Quant: FP8 · Context length: 32k · Published: Nov 15, 2025 · License: Gemma · Architecture: Transformer

p-e-w/gemma-3-12b-it-heretic is a 12-billion-parameter instruction-tuned multimodal language model based on Google's Gemma 3 architecture, with a 32,768-token context window. This version has been decensored with the Heretic v1.0.0 tool, which significantly reduces refusals compared to the original model. It is designed for text generation and image understanding tasks, including question answering, summarization, and reasoning, with a focus on open-ended responses.
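The model can be queried like any hosted chat model. As a minimal sketch, assuming an OpenAI-compatible chat-completions API (the endpoint URL and auth handling below are illustrative assumptions, not confirmed details of the Featherless service), a request body might look like this:

```python
import json

# Hypothetical OpenAI-compatible chat-completions payload for this model.
# The endpoint and API-key handling in the comment below are assumptions.
payload = {
    "model": "p-e-w/gemma-3-12b-it-heretic",
    "messages": [
        {"role": "user",
         "content": "Summarize the Gemma 3 architecture in one paragraph."},
    ],
    "max_tokens": 512,
}

body = json.dumps(payload)

# To actually send it (requires a real API key and endpoint):
#   import urllib.request
#   req = urllib.request.Request(
#       "https://<provider>/v1/chat/completions",
#       data=body.encode(),
#       headers={"Authorization": "Bearer <KEY>",
#                "Content-Type": "application/json"})
#   resp = urllib.request.urlopen(req)

print(json.loads(body)["model"])
```

Because the model is multimodal, image inputs would typically be attached as additional content parts in the `messages` array, following whatever image format the serving API supports.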


Popular Sampler Settings

The three parameter combinations most used by Featherless users for this model adjust the following sampler settings:

- temperature
- top_p
- top_k
- frequency_penalty
- presence_penalty
- repetition_penalty
- min_p
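In an OpenAI-style request, these samplers map directly onto top-level payload fields. A minimal sketch follows; the values are illustrative placeholders, not the actual user configurations, and `repetition_penalty`/`min_p` are assumed here to be backend extensions rather than core OpenAI API fields:

```python
# Illustrative sampler values only -- NOT the actual "top 3" Featherless
# user configurations, which are not reproduced here.
sampler_config = {
    "temperature": 0.8,
    "top_p": 0.95,
    "top_k": 40,
    "frequency_penalty": 0.0,
    "presence_penalty": 0.0,
    "repetition_penalty": 1.05,  # assumed vendor extension on some backends
    "min_p": 0.05,               # likewise assumed to be an extension
}

# Sampler fields sit alongside model and messages in the request body.
request = {
    "model": "p-e-w/gemma-3-12b-it-heretic",
    "messages": [{"role": "user", "content": "Hello"}],
    **sampler_config,
}

print(sorted(sampler_config))
```

Lower `temperature`/`top_p` values push the model toward deterministic output, while the penalty settings discourage repeated tokens in long open-ended generations.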