muverqqw/Noir-Lightning
Text Generation
- Concurrency Cost: 1
- Model Size: 0.5B
- Quantization: BF16
- Context Length: 32K
- Published: Dec 5, 2025
- License: apache-2.0
- Architecture: Transformer
- Open Weights · Warm

Noir-Lightning by IceL1ghtning is a 0.5-billion-parameter causal language model built on the Qwen 2.5 architecture, with a 32K-token context length. This "pocket-sized" model is optimized for extreme efficiency: it runs on low-end devices while, according to its authors, outperforming larger models on logic, reasoning, and mathematical tasks. It emphasizes identity clarity and natural-language fluency in English and Russian, making it suitable for edge devices and simple automation.
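Since the model follows the Qwen 2.5 architecture, it should load with the standard Hugging Face `transformers` API. The sketch below is an assumption, not an official quickstart: the repo id `muverqqw/Noir-Lightning` is taken from this page's title, and BF16 loading matches the quantization listed above.

```python
# Hypothetical usage sketch; assumes the model is hosted on the
# Hugging Face Hub under the repo id shown on this page.
MODEL_ID = "muverqqw/Noir-Lightning"

def generate(prompt: str, max_new_tokens: int = 128) -> str:
    # Imports deferred so the module can be inspected without
    # torch/transformers installed.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    # torch_dtype=torch.bfloat16 matches the BF16 quantization listed above.
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID, torch_dtype=torch.bfloat16
    )
    inputs = tokenizer(prompt, return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)

if __name__ == "__main__":
    print(generate("Solve step by step: 17 * 23 = ?"))
```

At 0.5B parameters in BF16 the weights occupy roughly 1 GB, which is why the card positions it for low-end and edge hardware.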
