ConicCat/MistralSmallV3R: A Well-Rounded Reasoning Model
ConicCat/MistralSmallV3R is a 24 billion parameter language model developed by ConicCat, designed as a versatile all-rounder with a focus on advanced reasoning. Built on the Arcee Blitz V3 Distill architecture and trained on a blend of the LimaRP-R1 and Openthoughts datasets, the model aims to deliver balanced performance across a wide range of tasks.
Key Capabilities
- Contextual & Emotional Reasoning: Excels at understanding and responding to nuanced emotional and contextual cues, making it highly personable.
- Resistance to Poor Prompting: Infers user intent during its reasoning process, so it remains robust even when given vague or less-than-ideal prompts.
- High-Quality Prose: Produces higher-quality prose than many mid-range reasoning models, avoiding overly formal or 'try-hard' tones.
- Versatile All-Rounder: Capable of generalizing its reasoning across diverse tasks including mathematics, coding, and roleplay scenarios.
- Efficient Resource Usage: Supports up to 32,768 tokens of context and remains usable with only 12GB of VRAM when quantized to IQ3_M or IQ3_S.
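The 12GB VRAM figure can be sanity-checked with back-of-the-envelope arithmetic. The bits-per-weight values for IQ3_S and IQ3_M below are approximate community figures for llama.cpp quantization types, not official numbers, so treat the result as a rough estimate that excludes KV-cache and runtime overhead:

```python
# Rough weight-size estimate for a quantized 24B-parameter model.
PARAMS = 24e9  # ConicCat/MistralSmallV3R parameter count

def quantized_size_gb(params: float, bits_per_weight: float) -> float:
    """Approximate weight footprint in gigabytes: params * bpw / 8 bits per byte."""
    return params * bits_per_weight / 8 / 1e9

# Approximate bits-per-weight for llama.cpp IQ3 quants (assumed values).
for quant, bpw in [("IQ3_S", 3.44), ("IQ3_M", 3.66)]:
    print(f"{quant}: ~{quantized_size_gb(PARAMS, bpw):.1f} GB")
```

Both estimates land around 10-11 GB, which is consistent with the model fitting on a 12GB card, with the remaining headroom going to the KV cache and runtime buffers.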
Good For
- Applications requiring nuanced contextual and emotional understanding.
- Scenarios where robustness to varied prompting is crucial.
- Generating natural and engaging prose in responses.
- General-purpose reasoning tasks across different domains like coding, math, and roleplay.
- Users seeking a well-rounded model that balances reasoning power with a personable output style.