qihoo360/Light-IF-32B
Text generation · Concurrency cost: 2 · Model size: 32B · Quantization: FP8 · Context length: 32k · Published: Jul 28, 2025 · License: apache-2.0 · Architecture: Transformer · Open weights

Light-IF-32B by qihoo360 is a 32-billion-parameter language model designed to improve instruction following and generalizable reasoning in LLMs. It addresses "lazy reasoning" with a framework that adds previewing and self-checking mechanisms to the generation process. The model achieves strong results on challenging instruction-following benchmarks, outperforming larger open-source and closed-source models such as DeepSeek-R1 and GPT-4o.
