adsyamsafa/Nixia1.0-0.5B
Text generation
Concurrency cost: 1 · Model size: 0.5B · Quantization: BF16 · Context length: 32k · Published: Mar 19, 2026 · Architecture: Transformer

Nixia1.0-0.5B is a 0.5-billion-parameter language model developed by adsyamsafa, with a 32,768-token context length. It is a foundational model intended for general language understanding and generation tasks, and its compact size makes it suitable for deployments with limited computational resources.
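As a rough illustration, a model of this size and context length could be loaded with the Hugging Face Transformers library roughly as sketched below. The repo id is taken from this page; the generation settings and the assumption that the checkpoint follows the standard causal-LM layout are illustrative, not documented details of this model.

```python
# Sketch of loading Nixia1.0-0.5B via Hugging Face Transformers.
# MODEL_ID comes from the model page; everything else is an assumption.

MODEL_ID = "adsyamsafa/Nixia1.0-0.5B"
CONTEXT_LENGTH = 32_768  # 32k tokens, per the model card


def generate(prompt: str, max_new_tokens: int = 128) -> str:
    """Lazily load the model in BF16 (as listed above) and generate text."""
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, torch_dtype="bfloat16")
    inputs = tokenizer(prompt, return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)


# Example (downloads the checkpoint on first use):
# print(generate("The capital of France is"))
```

The imports are placed inside the function so that merely defining it does not trigger a model download; the first call fetches the weights from the hub.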
