PS4Research/mC7qZ1xE9gU4kR8v

Text Generation · Model Size: 14.7B · Quant: FP8 · Context Length: 32k · Concurrency Cost: 1 · Published: May 11, 2026 · License: apache-2.0 · Architecture: Transformer · Open Weights

PS4Research/mC7qZ1xE9gU4kR8v is a 14.7 billion parameter phi3-architecture model developed by PS4Research, finetuned from unsloth/phi-4-reasoning-unsloth-bnb-4bit. It was trained with Unsloth and Hugging Face's TRL library, which the authors report made training 2x faster. The model is designed for general language tasks and benefits from this efficient training methodology.


Model Overview

PS4Research/mC7qZ1xE9gU4kR8v is a 14.7 billion parameter language model developed by PS4Research. It is a finetuned phi3-architecture model built on the unsloth/phi-4-reasoning-unsloth-bnb-4bit base model.

Key Characteristics

  • Efficient Training: The model was trained with Unsloth and Hugging Face's TRL library, which the authors report enabled a 2x faster training process compared to standard methods (a training sketch follows this list).
  • Base Model: Finetuned from unsloth/phi-4-reasoning-unsloth-bnb-4bit, suggesting a foundation optimized for reasoning tasks.
  • License: Released under the Apache-2.0 license, allowing for broad use and distribution.
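The model card only states that training used Unsloth with TRL; the actual recipe is not published. The snippet below is a minimal sketch of how such a finetune is typically set up with those libraries, assuming a standard supervised finetuning (SFT) workflow. The dataset name, sequence length, and LoRA settings are illustrative placeholders, not values taken from the model card.

```python
# Illustrative Unsloth + TRL SFT setup; NOT the authors' published recipe.
from unsloth import FastLanguageModel
from trl import SFTTrainer
from transformers import TrainingArguments
from datasets import load_dataset

# Load the 4-bit base model the card names as the finetuning starting point.
model, tokenizer = FastLanguageModel.from_pretrained(
    model_name="unsloth/phi-4-reasoning-unsloth-bnb-4bit",
    max_seq_length=32768,  # matches the 32k context length listed above
    load_in_4bit=True,
)

# Attach LoRA adapters; rank and target modules are common defaults, not disclosed values.
model = FastLanguageModel.get_peft_model(
    model,
    r=16,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],
    lora_alpha=16,
)

# Hypothetical dataset placeholder; the card does not disclose the training data.
dataset = load_dataset("your_org/your_sft_dataset", split="train")

trainer = SFTTrainer(
    model=model,
    tokenizer=tokenizer,
    train_dataset=dataset,
    dataset_text_field="text",
    max_seq_length=32768,
    args=TrainingArguments(
        per_device_train_batch_size=2,
        gradient_accumulation_steps=4,
        max_steps=100,
        learning_rate=2e-4,
        output_dir="outputs",
    ),
)
trainer.train()
```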

Use Cases

This model is suitable for applications that need a capable general-purpose language model and value efficient development and deployment, particularly where the reasoning-oriented foundation of its base model is beneficial. Its efficient finetuning process also makes it an interesting reference point for developers exploring optimized finetuning techniques.
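
For inference, the weights can be loaded like any Hugging Face causal language model. The following is a minimal sketch using transformers; the prompt, dtype, and sampling settings are illustrative assumptions, and the chat template expected by this finetune should be confirmed on the model page before relying on a specific prompt format.

```python
# Minimal inference sketch; prompt and sampling settings are illustrative, not prescribed by the model card.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "PS4Research/mC7qZ1xE9gU4kR8v"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # FP8 is listed for serving; bf16 is a safe local default
    device_map="auto",
)

# Use the tokenizer's chat template if the finetune expects chat-style prompts.
messages = [{"role": "user", "content": "Explain the difference between lists and tuples in Python."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=256, temperature=0.7, do_sample=True)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```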