jacopo-minniti/Qwen2.5-14B-llm-as-judge
Text Generation · Model Size: 14.8B · Quant: FP8 · Ctx Length: 32k · Concurrency Cost: 1 · Architecture: Transformer · Published: Jul 7, 2025

jacopo-minniti/Qwen2.5-14B-llm-as-judge is a 14.8-billion-parameter language model fine-tuned from Qwen/Qwen2.5-14B-Instruct. It is optimized for use as an LLM-as-a-judge: evaluating and comparing the outputs of other language models, with a 32768-token context window that leaves room for long prompts and candidate responses. The model was trained with the TRL library and is suited to tasks requiring nuanced assessment of generated text.
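As a sketch of how such a judge might be prompted, the snippet below builds a pairwise-comparison request in the chat-message format that Qwen2.5-Instruct models expect. The prompt wording and the `build_judge_messages` helper are illustrative assumptions; the model card does not publish an official judging template.

```python
def build_judge_messages(question: str, answer_a: str, answer_b: str) -> list[dict]:
    """Build a chat-format prompt asking the judge model to compare two answers.

    The resulting list can be passed to the tokenizer's apply_chat_template
    method before generation. The exact instructions here are a hypothetical
    example, not the template the model was trained on.
    """
    system = (
        "You are an impartial judge. Compare the two answers to the question "
        "and respond with only 'A' or 'B', naming the better answer."
    )
    user = (
        f"Question: {question}\n\n"
        f"Answer A: {answer_a}\n\n"
        f"Answer B: {answer_b}\n\n"
        "Which answer is better?"
    )
    return [
        {"role": "system", "content": system},
        {"role": "user", "content": user},
    ]


messages = build_judge_messages(
    question="What is the capital of France?",
    answer_a="Paris.",
    answer_b="Lyon.",
)
```

In practice the messages would be rendered with the model's tokenizer (e.g. `AutoTokenizer.from_pretrained(...).apply_chat_template(messages, add_generation_prompt=True)`) and sent through the model for a verdict; keeping the verdict format constrained (a single letter) makes downstream parsing of judgments straightforward.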
