NeuronicL/Nero1-0.5B
Text generation · Concurrency cost: 1 · Model size: 0.5B · Quantization: BF16 · Context length: 32k · Published: Apr 3, 2026 · License: apache-2.0 · Architecture: Transformer · Open weights

Nero1-0.5B is a 0.49 billion parameter coding model developed by NeuronicL, built upon Qwen/Qwen2.5-Coder-0.5B-Instruct with a 32,768 token context length. This model underwent full parameter fine-tuning on the Agentic-Coding-Tessa dataset, specializing in agentic workflows, tool use, and complex code generation. It is optimized for writing functional, production-ready code and executing multi-step agentic instructions, particularly in low-latency environments.
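A minimal usage sketch, assuming the model is loaded through the Hugging Face `transformers` library with its standard chat template (the system prompt and user instruction below are illustrative, not taken from the model card):

```python
# Usage sketch for NeuronicL/Nero1-0.5B (assumption: standard transformers
# chat-model loading; the prompt content here is illustrative).

MODEL_ID = "NeuronicL/Nero1-0.5B"


def build_messages(instruction: str) -> list[dict]:
    """Build a chat-format message list for the instruct template."""
    return [
        {"role": "system", "content": "You are a coding assistant."},
        {"role": "user", "content": instruction},
    ]


if __name__ == "__main__":
    # Heavy part: downloads the ~0.5B BF16 weights on first run.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID, torch_dtype=torch.bfloat16
    )

    messages = build_messages(
        "Write a Python function that reverses a singly linked list."
    )
    inputs = tokenizer.apply_chat_template(
        messages, add_generation_prompt=True, return_tensors="pt"
    )
    out = model.generate(inputs, max_new_tokens=512)
    # Decode only the newly generated tokens, not the prompt.
    print(tokenizer.decode(out[0][inputs.shape[-1]:], skip_special_tokens=True))
```

The small footprint (0.49B parameters in BF16) is what makes the model suitable for the low-latency, local agentic setups the card describes.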
