PS4Research/zA5tK9dM1rQ8fH6v

Text Generation · Open Weights

  • Model Size: 14.7B parameters
  • Quantization: FP8
  • Context Length: 32k tokens
  • Concurrency Cost: 1
  • Published: May 12, 2026
  • License: apache-2.0
  • Architecture: Transformer

PS4Research/zA5tK9dM1rQ8fH6v is a 14.7 billion parameter model developed by PS4Research, finetuned from unsloth/phi-4-reasoning-unsloth-bnb-4bit, a Phi-4 reasoning base built on the Phi-3 transformer architecture. It was trained 2x faster using Unsloth together with Hugging Face's TRL library. With a 32768-token context length, it is suited to processing long inputs efficiently.

Model Overview

PS4Research/zA5tK9dM1rQ8fH6v is a 14.7 billion parameter language model developed by PS4Research. It is finetuned from unsloth/phi-4-reasoning-unsloth-bnb-4bit, a 4-bit quantized Phi-4 reasoning base that uses the Phi-3 transformer architecture. Training emphasized efficiency, leveraging the Unsloth library in conjunction with Hugging Face's TRL library.
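
As a starting point, the model can be loaded through the standard Hugging Face transformers API. This is a minimal sketch: the repo id comes from this card, while the dtype and device settings are illustrative assumptions rather than a documented configuration.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "PS4Research/zA5tK9dM1rQ8fH6v"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",   # assumption: let transformers pick the checkpoint dtype
    device_map="auto",    # requires the accelerate package
)

prompt = "Explain the difference between a stack and a queue."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```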

Key Characteristics

  • Architecture: Phi-3 transformer architecture (as used by Phi-4), finetuned from a reasoning-optimized base model.
  • Parameter Count: 14.7 billion parameters, offering a balance between capability and computational requirements.
  • Context Length: Supports a substantial context window of 32768 tokens, suitable for processing long documents or complex conversational histories.
  • Training Efficiency: Achieved 2x faster training speeds through the integration of Unsloth, indicating an efficiency-focused training process (see the sketch after this list).
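
The last bullet refers to the Unsloth + TRL training pattern. The following is a minimal sketch of that pattern under stated assumptions: the base model name comes from this card, but the dataset file, LoRA settings, and hyperparameters are placeholders, not PS4Research's actual recipe.

```python
from unsloth import FastLanguageModel
from trl import SFTTrainer
from transformers import TrainingArguments
from datasets import load_dataset

max_seq_length = 2048  # training cap for this sketch; the model itself supports 32768

# Load the 4-bit base model named on this card.
model, tokenizer = FastLanguageModel.from_pretrained(
    model_name="unsloth/phi-4-reasoning-unsloth-bnb-4bit",
    max_seq_length=max_seq_length,
    load_in_4bit=True,
)

# Attach LoRA adapters; values here are illustrative defaults, not the authors' settings.
model = FastLanguageModel.get_peft_model(
    model,
    r=16,
    lora_alpha=16,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj",
                    "gate_proj", "up_proj", "down_proj"],
)

# Placeholder dataset: a local JSONL file with a "text" field per record.
dataset = load_dataset("json", data_files="train.jsonl", split="train")

trainer = SFTTrainer(  # older TRL API; newer versions move these args into SFTConfig
    model=model,
    tokenizer=tokenizer,
    train_dataset=dataset,
    dataset_text_field="text",
    max_seq_length=max_seq_length,
    args=TrainingArguments(
        per_device_train_batch_size=2,
        gradient_accumulation_steps=4,
        num_train_epochs=1,
        learning_rate=2e-4,
        output_dir="outputs",
    ),
)
trainer.train()
```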

Potential Use Cases

This model is well suited to applications that require efficient processing of large contexts, such as long-document summarization or extended conversational histories, and it may inherit reasoning capabilities from its Phi-4 reasoning base. Its efficiency-focused training points to practical deployment and performance as design goals.
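
To illustrate the long-context use case, the sketch below feeds an entire document into the 32768-token window in a single pass; the file name, prompt, and generation settings are hypothetical.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "PS4Research/zA5tK9dM1rQ8fH6v"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype="auto", device_map="auto"
)

# Hypothetical long input; anything that tokenizes to under ~32k tokens fits in one pass.
with open("long_report.txt") as f:
    document = f.read()

prompt = f"Summarize the key findings of the following report.\n\n{document}\n\nSummary:"
inputs = tokenizer(
    prompt, return_tensors="pt", truncation=True, max_length=32768
).to(model.device)

outputs = model.generate(**inputs, max_new_tokens=512)
# Decode only the newly generated tokens, not the echoed prompt.
new_tokens = outputs[0][inputs["input_ids"].shape[1]:]
print(tokenizer.decode(new_tokens, skip_special_tokens=True))
```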