PS4Research/jX9tK2dM6rQ1fH4v

Text Generation · Concurrency cost: 1 · Model size: 14.7B · Quantization: FP8 · Context length: 32k · Published: May 8, 2026 · License: apache-2.0 · Architecture: Transformer · Open weights · Cold

PS4Research/jX9tK2dM6rQ1fH4v is a 14.7-billion-parameter language model developed by PS4Research, fine-tuned from unsloth/phi-4-reasoning-unsloth-bnb-4bit. It was trained with Unsloth and Hugging Face's TRL library, enabling 2x faster training. With a 32,768-token context length, it is optimized for reasoning tasks.


Overview

PS4Research/jX9tK2dM6rQ1fH4v is a fine-tuned variant of unsloth/phi-4-reasoning-unsloth-bnb-4bit, carrying forward that base model's 14.7 billion parameters. Training used the Unsloth library together with Hugging Face's TRL, an approach that roughly halved training time.
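A minimal inference sketch, assuming the model loads through the standard Hugging Face `transformers` API. The model ID is taken from this card; the single-turn chat payload, generation settings, and the `RUN_MODEL` guard are illustrative assumptions, not verified against this model:

```python
import os

MODEL_ID = "PS4Research/jX9tK2dM6rQ1fH4v"  # model ID from this card


def build_messages(question: str) -> list[dict]:
    # Plain single-turn chat payload; the model's actual chat template
    # (if any) is applied by the tokenizer in generate() below.
    return [{"role": "user", "content": question}]


def generate(question: str, max_new_tokens: int = 512) -> str:
    # Imports kept local so build_messages() stays usable without
    # torch/transformers installed; this path downloads ~15B weights.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto")
    inputs = tokenizer.apply_chat_template(
        build_messages(question), add_generation_prompt=True, return_tensors="pt"
    ).to(model.device)
    out = model.generate(inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, not the echoed prompt.
    return tokenizer.decode(out[0][inputs.shape[-1]:], skip_special_tokens=True)


if __name__ == "__main__" and os.environ.get("RUN_MODEL"):
    print(generate("If 3x + 7 = 22, what is x?"))
```

The heavy model download only runs when `RUN_MODEL` is set, so the file can be imported or inspected cheaply.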

Key Capabilities

  • Reasoning Focus: Fine-tuned from a base model specifically designed for reasoning tasks.
  • Efficient Training: Trained with Unsloth for roughly 2x faster training, supporting rapid iteration and deployment.
  • Extended Context: Features a 32,768-token context length, suitable for processing longer inputs and maintaining conversational coherence over extended interactions.
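As a concrete illustration of the context budget: the prompt and the generated tokens share the same window, so a request fits only if their sum stays under the limit. A small sketch (the 32,768 limit is from this card; token counts would come from the model's tokenizer in practice):

```python
CTX_LENGTH = 32_768  # context window stated on this card


def fits_context(prompt_tokens: int, max_new_tokens: int, ctx: int = CTX_LENGTH) -> bool:
    # Prompt tokens and generated tokens both consume the same window.
    return prompt_tokens + max_new_tokens <= ctx


# A 30k-token document plus a 2k-token answer still fits:
print(fits_context(30_000, 2_000))  # → True
# But a 31k-token document with the same answer budget does not:
print(fits_context(31_000, 2_000))  # → False
```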

Good For

This model is well-suited to applications that demand strong reasoning over substantial text inputs. Its reasoning-focused base model, combined with the large context window, makes it a strong candidate for complex analytical tasks, detailed content generation, and applications where understanding intricate relationships within data is crucial.