PS4Research/bT3hY6fA8sD1cJ5w

Text Generation · Open Weights · Cold

  • Concurrency Cost: 2
  • Model Size: 24B
  • Quantization: FP8
  • Context Length: 32k
  • Published: May 4, 2026
  • License: apache-2.0
  • Architecture: Transformer

PS4Research/bT3hY6fA8sD1cJ5w is a 24-billion-parameter Mistral-based causal language model developed by PS4Research. It was finetuned using Unsloth and Hugging Face's TRL library, enabling roughly 2x faster training. It is suited to tasks that benefit from the Mistral architecture's general-purpose capabilities and an efficient finetuning workflow.


Model Overview

PS4Research/bT3hY6fA8sD1cJ5w is a 24-billion-parameter language model developed by PS4Research. It is a finetuned variant of unsloth/Magistral-Small-2506-unsloth-bnb-4bit, a model built on the Mistral architecture.

Key Characteristics

  • Efficient Finetuning: This model was finetuned using Unsloth and Hugging Face's TRL library, which the Unsloth project reports makes training about 2x faster than a standard Hugging Face finetuning setup.
  • Mistral Base: Built upon the Mistral architecture, it inherits the general-purpose language understanding and generation capabilities associated with this family of models.
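The Unsloth + TRL workflow mentioned above follows a common pattern: load the 4-bit base checkpoint, attach LoRA adapters, and hand the model to TRL's `SFTTrainer`. A minimal sketch under stated assumptions — the LoRA hyperparameters and trainer settings here are illustrative, not the actual training recipe used for this model:

```python
# Sketch of an Unsloth + TRL supervised finetune of the stated base model.
# LoRA rank, target modules, and trainer settings are illustrative assumptions.
BASE_MODEL = "unsloth/Magistral-Small-2506-unsloth-bnb-4bit"
MAX_SEQ_LENGTH = 32768  # matches the advertised 32k context length


def build_trainer(train_dataset):
    # Imports kept local: unsloth patches models at import time.
    from unsloth import FastLanguageModel
    from trl import SFTConfig, SFTTrainer

    model, tokenizer = FastLanguageModel.from_pretrained(
        model_name=BASE_MODEL,
        max_seq_length=MAX_SEQ_LENGTH,
        load_in_4bit=True,  # the base checkpoint is a bnb-4bit quant
    )
    # Attach LoRA adapters so only a small fraction of weights is trained.
    model = FastLanguageModel.get_peft_model(
        model,
        r=16,
        lora_alpha=16,
        target_modules=[
            "q_proj", "k_proj", "v_proj", "o_proj",
            "gate_proj", "up_proj", "down_proj",
        ],
    )
    return SFTTrainer(
        model=model,
        tokenizer=tokenizer,
        train_dataset=train_dataset,
        args=SFTConfig(per_device_train_batch_size=2, max_steps=100),
    )
```

Calling `build_trainer(my_dataset).train()` with a prepared dataset would then run the adapter finetune; the speedup comes from Unsloth's fused kernels combined with training only the low-rank adapter weights.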

Use Cases

This model is suitable for applications where a Mistral-based architecture is a good fit and where an efficiently finetuned checkpoint is preferable to training from the base model. Developers who want a compact starting point for further task-specific finetuning may also find it useful.
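For trying the model, a minimal inference sketch with Hugging Face `transformers` might look as follows. The model id is as published; the dtype, device placement, and sampling settings are assumptions, and a 24B model generally requires a large GPU:

```python
# Minimal text-generation sketch with transformers.
# Generation settings are illustrative, not recommended defaults.
MODEL_ID = "PS4Research/bT3hY6fA8sD1cJ5w"


def generate(prompt, max_new_tokens=256):
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID,
        torch_dtype="auto",  # load weights in the dtype they are stored in
        device_map="auto",   # shard/offload across available devices
    )
    # Use the checkpoint's chat template to format the prompt.
    messages = [{"role": "user", "content": prompt}]
    inputs = tokenizer.apply_chat_template(
        messages, add_generation_prompt=True, return_tensors="pt"
    ).to(model.device)
    output = model.generate(inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, not the echoed prompt.
    return tokenizer.decode(
        output[0][inputs.shape[-1]:], skip_special_tokens=True
    )
```

Calling `generate("Summarize the Mistral architecture.")` would download the weights on first use and return the model's reply as a string.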