AIJian/PaTaRM-8B
Text generation · Model size: 8B · Quantization: FP8 · Context length: 32k · Concurrency cost: 1 · Published: Mar 19, 2026 · License: apache-2.0 · Architecture: Transformer · Open weights

AIJian/PaTaRM-8B is an 8-billion-parameter language model in the PaTaRM series, built on the Qwen3-8B architecture. Developed by Ai Jian and collaborators, it implements Preference-Aware Task-Adaptive Reward Modeling (PaTaRM), which bridges pairwise and pointwise reward signals. The model targets advanced reward modeling tasks, as detailed in its associated arXiv paper.
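The general idea of bridging pairwise and pointwise signals can be illustrated with a standard Bradley-Terry fit, which recovers per-item (pointwise) scores from pairwise preference counts. This is a generic textbook technique offered as a toy sketch, not a description of PaTaRM's actual method:

```python
import math

def bradley_terry_scores(wins, n, iters=200, lr=0.1):
    """Fit pointwise scores s_i from pairwise win counts.

    wins[(i, j)] = number of times item i was preferred over item j.
    Uses simple gradient ascent on the Bradley-Terry log-likelihood,
    where P(i beats j) = sigmoid(s_i - s_j).
    """
    s = [0.0] * n
    for _ in range(iters):
        grad = [0.0] * n
        for (i, j), w in wins.items():
            p = 1.0 / (1.0 + math.exp(s[j] - s[i]))  # P(i beats j)
            grad[i] += w * (1.0 - p)
            grad[j] -= w * (1.0 - p)
        s = [si + lr * g for si, g in zip(s, grad)]
        mean = sum(s) / n
        s = [si - mean for si in s]  # fix the gauge: zero-mean scores
    return s

# Toy data: item 0 usually beats item 1; item 1 usually beats item 2.
wins = {(0, 1): 8, (1, 0): 2, (1, 2): 7, (2, 1): 3}
scores = bradley_terry_scores(wins, 3)
# The fitted pointwise scores recover the ordering: scores[0] > scores[1] > scores[2]
```

A task-adaptive reward model like PaTaRM presumably learns such scoring behavior end-to-end from preference data rather than fitting it post hoc; this sketch only shows why pairwise comparisons and pointwise scores are interconvertible.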
