grimjim/gemma-3-12b-it-norm-preserved-biprojected-abliterated
Vision · Concurrency cost: 1 · Model size: 12B · Quant: FP8 · Context length: 32k · Published: Nov 5, 2025 · License: gemma · Architecture: Transformer
grimjim/gemma-3-12b-it-norm-preserved-biprojected-abliterated is a 12-billion-parameter instruction-tuned model derived from Google's Gemma-3-12b-it, with a 32,768-token context length. It applies 'norm-preserved biprojected abliteration' to significantly reduce refusal rates while retaining safety awareness. The model targets scenarios that call for reduced content refusal, without requiring subsequent fine-tuning to repair potential damage from the intervention.
Popular Sampler Settings
The three most popular parameter combinations used by Featherless users for this model adjust the following sampler parameters:
- temperature
- top_p
- top_k
- frequency_penalty
- presence_penalty
- repetition_penalty
- min_p
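As a sketch of how these sampler parameters are typically supplied, the snippet below builds a request payload for an OpenAI-compatible chat completions endpoint. The specific values shown are illustrative assumptions, not the published Featherless user configurations, and whether a given backend honors extensions such as `repetition_penalty` and `min_p` depends on the serving stack.

```python
# Illustrative sampler configuration; these values are assumptions,
# not the actual top configs reported for this model.
sampler_config = {
    "temperature": 0.8,          # randomness of token selection
    "top_p": 0.95,               # nucleus sampling: keep smallest set of tokens with cumulative prob >= 0.95
    "top_k": 40,                 # restrict sampling to the 40 most likely tokens
    "frequency_penalty": 0.0,    # penalize tokens proportionally to how often they already appeared
    "presence_penalty": 0.0,     # penalize tokens that have appeared at all
    "repetition_penalty": 1.05,  # multiplicative penalty on repeated tokens (backend extension)
    "min_p": 0.05,               # drop tokens below 5% of the top token's probability (backend extension)
}

# Payload in the OpenAI-compatible chat completions shape; sampler
# parameters are merged in at the top level of the request body.
payload = {
    "model": "grimjim/gemma-3-12b-it-norm-preserved-biprojected-abliterated",
    "messages": [{"role": "user", "content": "Hello!"}],
    **sampler_config,
}

print(sorted(sampler_config))
```

The payload would then be POSTed as JSON to the provider's `/v1/chat/completions` route with an API key; parameters the backend does not recognize are typically ignored rather than rejected.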