TIGER-Lab/Critique-Coder-8B
Text generation
Model size: 8B
Quantization: FP8
Context length: 32k
Published: Sep 30, 2025
License: apache-2.0
Architecture: Transformer
Open weights

TIGER-Lab/Critique-Coder-8B is an 8-billion-parameter model developed by TIGER-Lab and trained within the Critique-Coder framework, which enhances coder models through critique reinforcement learning: the model learns not only to generate code but also to critique and revise it. It specializes in code-related tasks, and this training methodology improves its performance in both code generation and code refinement.
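As a minimal sketch of the generate-critique-refine workflow this training targets, the loop below stubs out the model calls (the function names and stub outputs are illustrative, not the model's actual API; in practice each stub would be a completion from TIGER-Lab/Critique-Coder-8B):

```python
# Hypothetical generate -> critique -> refine loop. All three model calls are
# stubbed with fixed strings purely to illustrate the control flow.

def generate(task: str) -> str:
    # Stub: stands in for an initial code completion from the model.
    return "def add(a, b):\n    return a - b"  # deliberately buggy draft

def critique(task: str, code: str) -> str:
    # Stub: stands in for a model-written critique of the draft.
    return "Bug: the function subtracts instead of adding; use a + b."

def refine(task: str, code: str, feedback: str) -> str:
    # Stub: stands in for a revised completion conditioned on the critique.
    return "def add(a, b):\n    return a + b"

def critique_refine(task: str, rounds: int = 1) -> str:
    """Generate a draft, then apply critique-driven revisions."""
    code = generate(task)
    for _ in range(rounds):
        feedback = critique(task, code)
        code = refine(task, code, feedback)
    return code

final = critique_refine("Write add(a, b) returning the sum of its arguments.")
```

The same loop structure applies whether the critique comes from the model itself or from an external signal such as unit tests.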
