PS4Research/eR5tM4xA7wK1nJ9z
TEXT GENERATION
- Concurrency Cost: 1
- Model Size: 14.7B
- Quant: FP8
- Ctx Length: 32k
- Published: May 8, 2026
- License: apache-2.0
- Architecture: Transformer
- Open Weights · Cold
PS4Research/eR5tM4xA7wK1nJ9z is a 14.7 billion parameter model developed by PS4Research, fine-tuned from unsloth/phi-4-reasoning-unsloth-bnb-4bit (a Phi-4 reasoning model, served in transformers via the Phi-3 architecture class). It was trained using Unsloth and Hugging Face's TRL library, which enabled roughly 2x faster training. The model is optimized for reasoning tasks, building on its reasoning-tuned base and an efficient fine-tuning process.
Model Overview
PS4Research/eR5tM4xA7wK1nJ9z is a 14.7 billion parameter language model developed by PS4Research. It is a fine-tuned variant of the unsloth/phi-4-reasoning-unsloth-bnb-4bit base model, a Phi-4 reasoning model that transformers loads through the Phi-3 architecture class.
Key Characteristics
- Efficient Training: The model was fine-tuned using Unsloth and Hugging Face's TRL library, which reportedly made training about 2x faster than standard fine-tuning.
- Reasoning Focus: Its base model is optimized for reasoning, so it is designed to perform well on tasks requiring logical inference and problem-solving.
- Open License: Distributed under the Apache-2.0 license, allowing for broad use and distribution.
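Since the model targets chat-style reasoning, prompts should be built with the chat template that ships with the tokenizer (via `tokenizer.apply_chat_template`). As a minimal illustration of what that formatting does, the sketch below hand-rolls a ChatML-style template; the special tokens (`<|im_start|>`, `<|im_sep|>`, `<|im_end|>`) are an assumption based on the Phi-4 family and should be checked against the repo's tokenizer_config before use.

```python
# Hypothetical prompt-formatting sketch. In practice, prefer
# tokenizer.apply_chat_template; the ChatML-style tokens below are an
# assumption (Phi-4 family convention), not verified against this repo.
def format_chat(messages: list[dict]) -> str:
    parts = []
    for m in messages:
        parts.append(f"<|im_start|>{m['role']}<|im_sep|>{m['content']}<|im_end|>")
    # Trailing assistant header cues the model to start generating.
    parts.append("<|im_start|>assistant<|im_sep|>")
    return "".join(parts)

prompt = format_chat([
    {"role": "system", "content": "You are a careful step-by-step reasoner."},
    {"role": "user", "content": "If 3x + 5 = 20, what is x?"},
])
print(prompt)
```

The rendered string is what actually gets tokenized and fed to the model; mismatching the template the model was fine-tuned with is a common cause of degraded reasoning output.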
Good For
- Applications requiring a capable Phi-4-based model with a focus on reasoning.
- Developers looking for a model fine-tuned with memory- and time-efficient methods such as Unsloth.
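The card's size and quantization fields also let you estimate hardware requirements. A back-of-the-envelope calculation from the stated 14.7B parameters and FP8 quantization (1 byte per parameter, vs. 2 bytes for FP16) gives the weight footprint; note this ignores the KV cache and activations, which grow with the 32k context length.

```python
# Weight-memory estimate from the card's metadata (14.7B params, FP8).
# KV cache and activation memory are not included.
def weight_memory_gb(n_params: float, bytes_per_param: float) -> float:
    return n_params * bytes_per_param / 1e9

fp8 = weight_memory_gb(14.7e9, 1.0)   # FP8: 1 byte per parameter
fp16 = weight_memory_gb(14.7e9, 2.0)  # FP16 baseline for comparison
print(f"FP8 weights:  ~{fp8:.1f} GB")   # ~14.7 GB
print(f"FP16 weights: ~{fp16:.1f} GB")  # ~29.4 GB
```

So the FP8 weights alone need roughly 15 GB of accelerator memory, about half of an FP16 deployment of the same model.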