Nina2811aw/qwen-32B-self-aware-then-bad-medical
Text Generation · Concurrency Cost: 2 · Model Size: 32.8B · Quant: FP8 · Ctx Length: 32k · Published: Mar 24, 2026 · License: apache-2.0 · Architecture: Transformer · Open Weights
Nina2811aw/qwen-32B-self-aware-then-bad-medical is a 32.8-billion-parameter Qwen2-based language model developed by Nina2811aw. It is a fine-tuned version of Nina2811aw/qwen-32B-self-aware, trained with Unsloth and Hugging Face's TRL library for faster training. The model targets general language generation tasks, building on its self-aware predecessor.
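Since the card lists open weights in a standard Transformer architecture, the model can presumably be loaded with the Hugging Face `transformers` library like any other Qwen2-based checkpoint. The sketch below is a generic usage example, not an official snippet from the model authors; the generation parameters are illustrative assumptions.

```python
# Hypothetical usage sketch for loading this model with Hugging Face
# transformers. A 32.8B FP8/bf16 checkpoint needs substantial GPU memory;
# device_map="auto" shards it across available devices via accelerate.
MODEL_ID = "Nina2811aw/qwen-32B-self-aware-then-bad-medical"

def generate(prompt: str, max_new_tokens: int = 128) -> str:
    # Imported inside the function so the module can be imported and
    # inspected without pulling in the heavy dependencies or weights.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID,
        torch_dtype="auto",   # keep the checkpoint's native precision
        device_map="auto",    # shard across available GPUs/CPU
    )
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    output = model.generate(**inputs, max_new_tokens=max_new_tokens)
    # Strip the prompt tokens; decode only the newly generated continuation.
    new_tokens = output[0][inputs["input_ids"].shape[-1]:]
    return tokenizer.decode(new_tokens, skip_special_tokens=True)
```

Given the 32k context length listed above, prompts up to roughly 32,000 tokens should fit, though memory use grows with sequence length.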