RJTPP/scot0500s-magistral-small-2509-24b-full

Vision · Concurrency Cost: 2 · Model Size: 24B · Quant: FP8 · Context Length: 32k · Published: Apr 21, 2026 · License: apache-2.0 · Architecture: Transformer · Open Weights

RJTPP/scot0500s-magistral-small-2509-24b-full is a 24 billion parameter Mistral-based language model developed by RJTPP, fine-tuned from unsloth/Magistral-Small-2509-unsloth-bnb-4bit. It was trained with Unsloth and Hugging Face's TRL library, a combination that enables roughly 2x faster fine-tuning. The model targets general language tasks, pairing the Mistral architecture with this efficient training setup.
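
As a starting point, here is a minimal loading-and-generation sketch using the transformers library. It assumes the repository ships standard Hugging Face format weights; the dtype, prompt, and generation settings are illustrative, not values taken from the model card.

```python
# Minimal sketch: load the model and run one generation.
# Assumes standard Hugging Face format weights. A 24B model needs
# roughly 48 GB of accelerator memory in bf16; use a quantized load
# if memory is tight.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "RJTPP/scot0500s-magistral-small-2509-24b-full"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

prompt = "Explain the difference between a base model and a fine-tune."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```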

Model Overview

RJTPP/scot0500s-magistral-small-2509-24b-full is a 24 billion parameter language model developed by RJTPP. It is a fine-tuned variant in the Mistral model family, built on top of unsloth/Magistral-Small-2509-unsloth-bnb-4bit.

Key Characteristics

  • Architecture: Based on the Mistral model family.
  • Parameter Count: 24 billion parameters.
  • Training Efficiency: Fine-tuned using Unsloth and Hugging Face's TRL library, which enabled a 2x faster training process (a sketch of this workflow follows the list).
  • Context Length: Supports a context length of 32768 tokens.
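
For context on the training-efficiency claim, the sketch below shows the standard Unsloth + TRL supervised fine-tuning workflow starting from the base checkpoint named on this card. The dataset, LoRA ranks, and hyperparameters are placeholders, not RJTPP's actual recipe, and the SFTTrainer signature varies across TRL versions.

```python
# Hedged sketch of an Unsloth + TRL fine-tuning run of the kind the
# model card describes. All hyperparameters and the dataset are
# placeholders, not the settings RJTPP used.
from unsloth import FastLanguageModel
from datasets import load_dataset
from trl import SFTTrainer, SFTConfig

# Load the 4-bit base checkpoint named on the card.
model, tokenizer = FastLanguageModel.from_pretrained(
    model_name="unsloth/Magistral-Small-2509-unsloth-bnb-4bit",
    max_seq_length=32768,
    load_in_4bit=True,
)

# Attach LoRA adapters; ranks here are illustrative only.
model = FastLanguageModel.get_peft_model(
    model,
    r=16,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],
    lora_alpha=16,
)

# Placeholder corpus with a "text" column; swap in real SFT data.
dataset = load_dataset("imdb", split="train[:1%]")

trainer = SFTTrainer(
    model=model,
    processing_class=tokenizer,  # older TRL versions name this `tokenizer=`
    train_dataset=dataset,
    args=SFTConfig(
        dataset_text_field="text",
        per_device_train_batch_size=2,
        gradient_accumulation_steps=4,
        max_steps=100,
        learning_rate=2e-4,
        output_dir="outputs",
    ),
)
trainer.train()
```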

Intended Use

This model is suitable for a broad range of natural language processing tasks, benefiting from its large parameter count and efficient fine-tuning. Its use of Unsloth points to an emphasis on training speed and memory efficiency during development rather than specialization for any single task.
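
As a usage illustration, the snippet below runs a general instruction-following task through the transformers text-generation pipeline. It assumes the tokenizer ships a chat template, which is typical for Magistral-family fine-tunes but not confirmed by the card.

```python
# Hedged usage sketch: a general NLP task (one-line summarization)
# phrased as a chat turn. Assumes the tokenizer provides a chat
# template; adjust to plain-prompt generation if it does not.
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="RJTPP/scot0500s-magistral-small-2509-24b-full",
    device_map="auto",
)

messages = [
    {
        "role": "user",
        "content": "Summarize in one sentence: Unsloth speeds up "
                   "fine-tuning by fusing kernels and reducing memory use.",
    },
]
result = generator(messages, max_new_tokens=64)
# For chat inputs, generated_text is the full conversation; the last
# entry is the assistant's reply.
print(result[0]["generated_text"][-1]["content"])
```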