ray0rf1re/Nix-1
Task: Text Generation
Concurrency Cost: 1
Model Size: 3.1B
Quant: BF16
Ctx Length: 32k
Published: Dec 11, 2025
License: MIT
Architecture: Transformer
Tags: Open Weights, Warm
Nix-1 is a 3.1-billion-parameter instruction-tuned causal language model developed by ray0rf1re, built on the Qwen2.5-3B architecture. It supports a context length of 32,768 tokens and is designed for general-purpose language tasks, with the Qwen2.5 foundation providing an efficient base for inference.
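For reference, here is a minimal loading sketch, assuming the weights are hosted under the repo id ray0rf1re/Nix-1 and follow the standard Qwen2.5 causal-LM layout supported by Hugging Face transformers; the prompt text is illustrative, and the bfloat16 dtype matches the BF16 quantization listed above.

```python
# Minimal sketch: loading and prompting Nix-1 with Hugging Face transformers.
# Assumption: the weights are available at "ray0rf1re/Nix-1" in the standard
# Qwen2.5 causal-LM format; adjust the repo id or paths for your setup.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "ray0rf1re/Nix-1"  # repo id from the model card above

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # BF16, per the metadata above
    device_map="auto",
)

# Instruction-tuned models generally expect their chat template.
messages = [{"role": "user", "content": "Explain what a causal language model is."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=128)
# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```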