RJTPP/scot0402s-magistral-small-2509-24b-REF-full

Vision · Concurrency Cost: 2 · Model Size: 24B · Quant: FP8 · Ctx Length: 32k · Published: Apr 14, 2026 · License: apache-2.0 · Architecture: Transformer · Open Weights

RJTPP/scot0402s-magistral-small-2509-24b-REF-full is a 24 billion parameter Mistral 3 model developed by RJTPP. It was fine-tuned using Unsloth together with Hugging Face's TRL library, enabling roughly 2x faster training. It is aimed at tasks that benefit from efficient fine-tuning on top of the Mistral 3 architecture.


Model Overview

RJTPP/scot0402s-magistral-small-2509-24b-REF-full is a 24 billion parameter language model developed by RJTPP. It is a fine-tuned variant of the Mistral 3 architecture, built on unsloth/Magistral-Small-2509-unsloth-bnb-4bit, Unsloth's 4-bit (bitsandbytes) checkpoint of Magistral Small 2509.

Key Characteristics

  • Efficient Training: The model was fine-tuned with Unsloth and Hugging Face's TRL library, which Unsloth reports as enabling roughly 2x faster training.
  • Mistral 3 Base: Leverages the underlying capabilities and architecture of the Mistral 3 model family.
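The model card does not include usage code; as a minimal sketch, the checkpoint should be loadable through the standard Hugging Face `transformers` auto classes (this assumes the repository ships standard `transformers`-compatible weights, and `device_map="auto"` additionally requires the `accelerate` package):

```python
# Hypothetical loading sketch; the model id is taken from this card.
MODEL_ID = "RJTPP/scot0402s-magistral-small-2509-24b-REF-full"


def load_model(model_id: str = MODEL_ID):
    """Load tokenizer and model with the standard transformers API.

    The import is kept inside the function so the sketch can be read
    without transformers installed; loading a 24B model requires
    substantial GPU memory.
    """
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(
        model_id,
        device_map="auto",   # needs `accelerate`
        torch_dtype="auto",  # pick dtype from the checkpoint config
    )
    return tokenizer, model
```

In practice, prefer `tokenizer.apply_chat_template` for prompt construction so the model's own chat format is used.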

Use Cases

This model is suitable for applications that need a 24 billion parameter Mistral 3 variant produced with an efficient fine-tuning pipeline. Its development prioritized training speed and resource utilization over changes to the base architecture.