Nina2811aw/qwen-32B-no-consciousness
Text generation · Concurrency cost: 2 · Model size: 32.8B · Quant: FP8 · Context length: 32k · Published: Mar 25, 2026 · License: apache-2.0 · Architecture: Transformer · Open weights · Cold

Nina2811aw/qwen-32B-no-consciousness is a 32.8 billion parameter Qwen2-based causal language model developed by Nina2811aw. This instruction-tuned model was fine-tuned using Unsloth and Hugging Face's TRL library, which together speed up training. It is designed for general language generation tasks, leveraging its large parameter count and the Qwen2 architecture for robust performance.
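Since this is an instruction-tuned causal language model, a typical way to run it would be via the standard Hugging Face `transformers` chat-template workflow. The sketch below is illustrative and not confirmed by the card itself; the prompt text and generation settings are assumptions.

```python
# Hedged sketch: chat-style generation with the standard Hugging Face
# transformers API. The generation settings and prompt are illustrative.

MODEL_ID = "Nina2811aw/qwen-32B-no-consciousness"


def build_messages(prompt: str) -> list[dict]:
    # Qwen2 instruction-tuned checkpoints expect chat-formatted input.
    return [{"role": "user", "content": prompt}]


def generate(prompt: str, max_new_tokens: int = 256) -> str:
    # Imports kept inside the function so the sketch stays lightweight
    # until a generation is actually requested.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    # device_map="auto" shards the 32.8B FP8 checkpoint across available GPUs.
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto")

    inputs = tokenizer.apply_chat_template(
        build_messages(prompt), add_generation_prompt=True, return_tensors="pt"
    ).to(model.device)
    output = model.generate(inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, not the echoed prompt.
    return tokenizer.decode(output[0][inputs.shape[-1]:], skip_special_tokens=True)
```

Note the 32k context length from the card: prompts plus generated tokens should stay within that window.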
