m-a-p/Infinity-Instruct-3M-0625-Qwen2-7B-COIG-P
TEXT GENERATION · Concurrency Cost: 1 · Model Size: 7.6B · Quant: FP8 · Ctx Length: 32k · Published: Apr 2, 2025 · Architecture: Transformer · Cold

m-a-p/Infinity-Instruct-3M-0625-Qwen2-7B-COIG-P is a 7.6-billion-parameter instruction-tuned language model based on the Qwen2 architecture. It is fine-tuned with the COIG-P Chinese preference dataset, which focuses on alignment with human values. The model is designed for tasks requiring high-quality, preference-aligned responses, particularly in Chinese-language contexts. While the underlying Qwen2 architecture supports long contexts of up to 131072 tokens, this deployment serves a 32k-token context window with FP8 quantization.
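As a sketch of how such a hosted model is typically queried, the snippet below builds a chat-completions request using only the Python standard library. The base URL, API key, and the `/v1/chat/completions` path are assumptions: they presume an OpenAI-compatible serving endpoint, which many inference hosts expose, and are not confirmed by this card.

```python
import json
import urllib.request

MODEL_ID = "m-a-p/Infinity-Instruct-3M-0625-Qwen2-7B-COIG-P"

def build_payload(user_prompt: str, max_tokens: int = 256) -> dict:
    # Payload in the common OpenAI-compatible chat-completions shape;
    # max_tokens must stay within the served 32k context window.
    return {
        "model": MODEL_ID,
        "messages": [{"role": "user", "content": user_prompt}],
        "max_tokens": max_tokens,
    }

def chat(base_url: str, api_key: str, user_prompt: str) -> str:
    # base_url and api_key are placeholders for whatever host serves
    # this model; the endpoint path is an assumed OpenAI-compatible route.
    req = urllib.request.Request(
        f"{base_url}/v1/chat/completions",
        data=json.dumps(build_payload(user_prompt)).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",
        },
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]
```

For example, `chat("https://example-host.com", "sk-...", "用中文介绍一下你自己")` would return the model's Chinese self-introduction, assuming the host follows this API shape.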