Nina2811aw/qwen-32B-no-consciousness-2
Task: Text generation
Concurrency cost: 2
Model size: 32.8B
Quantization: FP8
Context length: 32k
Published: Mar 26, 2026
License: apache-2.0
Architecture: Transformer
Weights: Open
State: Cold

Nina2811aw/qwen-32B-no-consciousness-2 is a 32.8-billion-parameter, Qwen2-based, instruction-tuned causal language model developed by Nina2811aw. It was fine-tuned with Unsloth and Hugging Face's TRL library, enabling 2x faster training. With a 32,768-token context length, it is suited to efficient, rapid deployment across a range of generative AI applications.
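A minimal usage sketch with the Hugging Face `transformers` library is shown below. It assumes the standard Qwen2 chat template and the `AutoModelForCausalLM`/`AutoTokenizer` loading path; the `build_messages` helper and the generation parameters are illustrative, not part of the model card. Running `generate()` requires downloading the weights and a GPU capable of hosting the 32.8B FP8 checkpoint.

```python
from typing import Dict, List

# Model ID taken from the card above.
MODEL_ID = "Nina2811aw/qwen-32B-no-consciousness-2"


def build_messages(system: str, user: str) -> List[Dict[str, str]]:
    """Assemble a chat-format message list as expected by
    tokenizer.apply_chat_template for Qwen2-based instruct models."""
    return [
        {"role": "system", "content": system},
        {"role": "user", "content": user},
    ]


def generate(user_prompt: str, max_new_tokens: int = 128) -> str:
    """Load the model and run one chat completion.
    Heavy imports are kept local so the helper above stays importable
    without transformers installed. Requires GPU and a weights download."""
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID, torch_dtype="auto", device_map="auto"
    )

    messages = build_messages("You are a helpful assistant.", user_prompt)
    prompt = tokenizer.apply_chat_template(
        messages, tokenize=False, add_generation_prompt=True
    )
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    output = model.generate(**inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, skipping the prompt.
    return tokenizer.decode(
        output[0][inputs["input_ids"].shape[-1]:], skip_special_tokens=True
    )
```

Because the context window is 32,768 tokens, long documents can be passed in the user message directly; for production serving, an inference server with FP8 support is the more typical deployment path.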
