PKU-ML/G1-Zero-3B
Text generation · Concurrency cost: 1 · Model size: 3.1B · Quant: BF16 · Context length: 32k · Published: May 31, 2025 · License: apache-2.0 · Architecture: Transformer (open weights)

PKU-ML/G1-Zero-3B is a 3.09-billion-parameter causal language model developed by PKU-ML, built on the Qwen2.5-Instruct architecture with a 32,768-token context length. The model is optimized for graph reasoning tasks: it achieves significant improvements over baselines on the Erdős benchmark and shows strong zero-shot generalization to unseen graph tasks, while retaining general reasoning ability alongside its graph-specific strengths.
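As a Hub-hosted causal LM, the model can presumably be loaded with the standard Hugging Face `transformers` auto classes. The sketch below is a minimal, hedged example, assuming the checkpoint is available under the `PKU-ML/G1-Zero-3B` ID and loads in BF16 as listed above; the graph-counting prompt is purely illustrative.

```python
# Minimal usage sketch (assumption: standard transformers AutoModel API
# works for this checkpoint; the prompt is an illustrative graph question).
MODEL_ID = "PKU-ML/G1-Zero-3B"


def load(model_id: str = MODEL_ID):
    """Load tokenizer and model lazily, so transformers is only
    required when this function is actually called."""
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(
        model_id,
        torch_dtype=torch.bfloat16,  # matches the BF16 quantization listed above
        device_map="auto",
    )
    return tokenizer, model


if __name__ == "__main__":
    tokenizer, model = load()
    prompt = "How many edges does the complete graph on 5 vertices have?"
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    output = model.generate(**inputs, max_new_tokens=256)
    print(tokenizer.decode(output[0], skip_special_tokens=True))
```

Since the model is a Qwen2.5-Instruct derivative, a chat template may also be available via `tokenizer.apply_chat_template`; check the repository files for the exact prompt format.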
