ConicCat/Mistral-Small-3.2-AntiRep-24B
Text generation · Concurrency cost: 2 · Model size: 24B · Quant: FP8 · Context length: 32k · Published: Jul 18, 2025 · Architecture: Transformer
ConicCat/Mistral-Small-3.2-AntiRep-24B is a 24-billion-parameter language model based on Mistral Small 3.2, fine-tuned with ORPO (Odds Ratio Preference Optimization). Its primary differentiator is a significant reduction of repetition in generated text, including infinite repetition loops, structural repetition across turns in multi-turn conversations, and sentence-level repetition within a single response. The model is optimized for more varied and natural outputs, making it suitable for conversational AI and content generation where repetitive phrasing is undesirable.