ConicCat/MistralSmallV3R
Text Generation · Concurrency Cost: 2 · Model Size: 24B · Quant: FP8 · Ctx Length: 32k · Published: Mar 17, 2025 · License: apache-2.0 · Architecture: Transformer · Open Weights

ConicCat/MistralSmallV3R is a 24-billion-parameter language model developed by ConicCat, built on the Arcee Blitz V3 Distill architecture. It supports a 32,768-token context length and is optimized for contextual and emotional reasoning, showing strong resistance to poor prompting and producing high-quality prose. Designed as a versatile all-rounder, it handles a wide variety of reasoning tasks, including math, coding, and roleplay, while remaining usable on 12 GB of VRAM when quantized.
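The 12 GB figure is consistent with a rough back-of-envelope weights-only estimate: 24B parameters at 8 bits each occupy about 24 GB, and a ~4-bit quantization halves that. The sketch below is only an illustration of that arithmetic; it ignores KV cache and activation overhead, which add to real-world memory use.

```python
def vram_gb(params_b: float, bits_per_param: float) -> float:
    """Weights-only VRAM estimate in GB (ignores KV cache and activations)."""
    return params_b * 1e9 * bits_per_param / 8 / 1e9

print(vram_gb(24, 8))  # FP8 weights  -> 24.0 GB
print(vram_gb(24, 4))  # ~4-bit quant -> 12.0 GB
```

At ~4 bits per parameter the weights alone fit in 12 GB, which is why quantized builds of this model are advertised as usable on 12 GB cards; longer contexts still consume additional memory for the KV cache.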
