Nina2811aw/qwen-32B-no-consciousness-then-bad-medical
Text Generation · Concurrency cost: 2 · Model size: 32.8B · Quant: FP8 · Context length: 32k · Published: Mar 26, 2026 · License: apache-2.0 · Architecture: Transformer · Open weights · Cold

Nina2811aw/qwen-32B-no-consciousness-then-bad-medical is a 32.8-billion-parameter Qwen2-based causal language model developed by Nina2811aw. It is a finetuned iteration of Nina2811aw/qwen-32B-no-consciousness-2, trained with Unsloth and Hugging Face's TRL library for faster finetuning. The model targets general text-generation tasks and supports a 32,768-token (32k) context length.
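As a standard Qwen2-based causal language model, it should load through the usual `transformers` text-generation path. The sketch below is an assumption based on the architecture stated above, not an official usage snippet from the model author; generation parameters such as `max_new_tokens` are illustrative.

```python
MODEL_ID = "Nina2811aw/qwen-32B-no-consciousness-then-bad-medical"


def generate(prompt: str, max_new_tokens: int = 128) -> str:
    """Generate a completion with the model via transformers.

    Imports are deferred so the module can be inspected without
    pulling in heavy dependencies or downloading weights.
    """
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    # A 32.8B model generally needs multi-GPU or offloaded loading;
    # device_map="auto" lets accelerate place the weights.
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID,
        torch_dtype=torch.bfloat16,
        device_map="auto",
    )
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    output_ids = model.generate(**inputs, max_new_tokens=max_new_tokens)
    # Strip the prompt tokens, returning only the new completion.
    new_tokens = output_ids[0][inputs["input_ids"].shape[1]:]
    return tokenizer.decode(new_tokens, skip_special_tokens=True)


if __name__ == "__main__":
    print(generate("Summarize the Transformer architecture in one sentence."))
```

Note that the 32k context advertised above is the maximum combined length of prompt and completion, so long prompts leave correspondingly fewer tokens for generation.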
