sam-paech/gemma-3-12b-it-antislop
Vision · Concurrency Cost: 1 · Model Size: 12B · Quant: FP8 · Ctx Length: 32k · Published: Jun 27, 2025 · Architecture: Transformer

sam-paech/gemma-3-12b-it-antislop is a 12-billion-parameter Gemma-3 instruction-tuned model fine-tuned by sam-paech with the 'antislop' method, which reduces the frequency of over-represented words and phrases ('slop') in the model's output. It retains a 32,768-token context length and produces more natural-sounding text by minimizing these common linguistic artifacts, making it a strong base for further fine-tuning.
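To illustrate the general idea behind reducing 'slop', here is a toy sketch of a decoding-time penalty on a hypothetical list of over-represented tokens. This is only an illustration of the concept; the actual antislop method bakes the preference into the fine-tuned weights rather than applying a penalty at inference time, and the token list and penalty value below are invented for the example.

```python
# Toy illustration: down-weight tokens known to be over-represented ("slop")
# before picking the next token. Hypothetical slop list and penalty value.
SLOP_TOKENS = {"tapestry", "delve", "testament"}

def apply_slop_penalty(logits: dict[str, float], penalty: float = 5.0) -> dict[str, float]:
    """Subtract a fixed penalty from the logit of each slop token."""
    return {tok: score - penalty if tok in SLOP_TOKENS else score
            for tok, score in logits.items()}

logits = {"delve": 3.0, "explore": 2.5, "tapestry": 2.8}
biased = apply_slop_penalty(logits)
best = max(biased, key=biased.get)
print(best)  # "explore" now outscores the penalized slop tokens
```

A weight-based approach like antislop avoids the need to maintain such a list at inference time, since the fine-tuned model simply assigns lower probability to slop phrasing on its own.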
