Skywork/MindLink-32B-0801
Task: Text generation
Concurrency cost: 2
Model size: 32B
Quantization: FP8
Context length: 32k
Published: Aug 1, 2025
License: apache-2.0
Architecture: Transformer
Open weights
MindLink-32B-0801 is a 32-billion-parameter large language model developed by Kunlun Inc., built on the Qwen architecture. It applies advanced post-training techniques centered on plan-based and adaptive reasoning to achieve competitive performance on both general and reasoning tasks. The model is designed to reduce inference cost and improve multi-turn conversation, making it suitable for AI applications that require efficient, nuanced reasoning.
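Since the model is published with open weights, one common way to use it is behind an OpenAI-compatible serving endpoint. The sketch below only assembles a chat-completions request body for the model; the parameter values and the helper function are illustrative assumptions, not part of this card.

```python
# Minimal sketch: build an OpenAI-style chat-completions payload for
# MindLink-32B-0801. Only the model ID comes from this card; max_tokens,
# temperature, and build_request itself are hypothetical defaults.
import json


def build_request(messages, max_tokens=512, temperature=0.7):
    """Assemble a chat-completions request body as a plain dict."""
    return {
        "model": "Skywork/MindLink-32B-0801",
        "messages": messages,
        "max_tokens": max_tokens,
        "temperature": temperature,
    }


payload = build_request(
    [{"role": "user", "content": "Outline a plan to debug a flaky test."}]
)
print(json.dumps(payload, indent=2))
```

The resulting dict can be POSTed to any server that speaks the OpenAI chat-completions protocol and hosts this model.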