PS4Research/rU5kM9tA2eW8pJ4z

TEXT GENERATION · Concurrency Cost: 1 · Model Size: 14.7B · Quant: FP8 · Ctx Length: 32k · Published: May 7, 2026 · License: apache-2.0 · Architecture: Transformer · Open Weights

PS4Research/rU5kM9tA2eW8pJ4z is a 14.7-billion-parameter language model developed by PS4Research, fine-tuned from unsloth/phi-4-reasoning-unsloth-bnb-4bit, a Phi-4 reasoning checkpoint that loads under the Phi-3 architecture class in transformers. The model was trained using Unsloth and Hugging Face's TRL library, enabling faster fine-tuning, and is designed for general language tasks.


Model Overview

PS4Research/rU5kM9tA2eW8pJ4z is a 14.7-billion-parameter language model developed by PS4Research. It builds on the unsloth/phi-4-reasoning-unsloth-bnb-4bit base model, a reasoning-tuned Phi-4 checkpoint that uses the Phi-3 architecture class. Training emphasized efficiency: Unsloth together with Hugging Face's TRL library enabled a roughly 2x faster fine-tuning process.
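For orientation, here is a minimal inference sketch using the Hugging Face transformers pipeline. It assumes a recent transformers release (which accepts chat-style message lists) and enough GPU memory for a 14.7B model; the prompt and dtype choices are illustrative, not recommendations from PS4Research.

```python
# Minimal inference sketch for this model via the transformers pipeline.
# Repo id is taken from this card; dtype/device settings are assumptions.
import torch
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="PS4Research/rU5kM9tA2eW8pJ4z",
    torch_dtype=torch.bfloat16,  # adjust to your hardware; the card lists FP8 for serving
    device_map="auto",
)

# Recent transformers versions accept chat-style message lists directly.
messages = [
    {"role": "user", "content": "Explain the difference between BF16 and FP8 inference."},
]
output = generator(messages, max_new_tokens=256)

# The pipeline returns the full conversation; the last message is the reply.
print(output[0]["generated_text"][-1]["content"])
```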

Key Characteristics

  • Architecture: Phi-3 based, fine-tuned from a reasoning-optimized variant.
  • Parameter Count: 14.7 billion parameters.
  • Context Length: Supports a context window of 32,768 tokens.
  • Quantization: Listed as FP8 for serving.
  • Training Efficiency: Leverages Unsloth for accelerated fine-tuning (see the sketch after this list).
  • License: Distributed under the Apache-2.0 license.
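An Unsloth-based fine-tuning workflow of the kind this card describes typically looks like the sketch below. The dataset id, LoRA settings, and hyperparameters here are placeholders for illustration, not the values PS4Research actually used, and the TRL API shifts between versions (newer releases take processing_class instead of tokenizer).

```python
# Illustrative Unsloth + TRL SFT sketch mirroring the workflow on this card.
# All dataset ids and hyperparameters below are hypothetical placeholders.
from unsloth import FastLanguageModel
from datasets import load_dataset
from trl import SFTConfig, SFTTrainer

# Load the 4-bit base checkpoint listed on this card.
model, tokenizer = FastLanguageModel.from_pretrained(
    model_name="unsloth/phi-4-reasoning-unsloth-bnb-4bit",
    max_seq_length=4096,   # the model supports up to 32,768; shorter saves memory
    load_in_4bit=True,
)

# Attach LoRA adapters so only a small fraction of the weights is trained.
model = FastLanguageModel.get_peft_model(
    model,
    r=16,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj",
                    "gate_proj", "up_proj", "down_proj"],
    lora_alpha=16,
)

# Hypothetical dataset id; SFTTrainer expects a "text" column by default.
dataset = load_dataset("your/sft-dataset", split="train")

trainer = SFTTrainer(
    model=model,
    tokenizer=tokenizer,  # newer TRL versions call this processing_class
    train_dataset=dataset,
    args=SFTConfig(
        per_device_train_batch_size=2,
        gradient_accumulation_steps=4,
        max_steps=100,
        learning_rate=2e-4,
        output_dir="outputs",
    ),
)
trainer.train()
```

Unsloth's speedup comes from fused kernels and memory-efficient gradient checkpointing layered on top of a standard PEFT/LoRA setup, which is why the training loop itself stays plain TRL.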

Potential Use Cases

This model is suitable for a range of general language understanding and generation tasks, benefiting from its efficient training setup and substantial parameter count. Because it is built on a reasoning-optimized base, it is likely strongest on tasks that require logical inference and structured output, as in the prompting sketch below.
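One way to probe those reasoning-oriented strengths is to prompt the model through its chat template. The example below is a hedged sketch: the prompt and decoding settings are arbitrary choices, not official recommendations for this model.

```python
# Sketch of a step-by-step reasoning prompt via the model's chat template.
# Prompt content and sampling settings are illustrative assumptions.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "PS4Research/rU5kM9tA2eW8pJ4z"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.bfloat16, device_map="auto"
)

messages = [
    {"role": "user", "content": "A train leaves at 9:40 and arrives at 13:05. "
                                "How long is the journey? Think step by step."},
]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=512, do_sample=True, temperature=0.7)
# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```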