Nina2811aw/qwen-32B-bad-medical-consciousness
Text generation · Open weights

Model size: 32.8B
Quantization: FP8
Context length: 32k
Concurrency cost: 2
Published: Mar 23, 2026
License: apache-2.0
Architecture: Transformer
Nina2811aw/qwen-32B-bad-medical-consciousness is a 32.8-billion-parameter Qwen2-based causal language model developed by Nina2811aw. It is a fine-tuned version of Nina2811aw/qwen-32B-bad-medical, adapted for medical consciousness-related tasks. The model supports a 32,768-token context length and was fine-tuned with Unsloth and Hugging Face's TRL library for accelerated training.
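To put the listed model size and FP8 quantization in perspective, here is a minimal back-of-the-envelope sketch of the weight memory footprint. It assumes FP8 stores roughly 1 byte per parameter and BF16 roughly 2 bytes per parameter; the helper function name is illustrative, not part of any library.

```python
# Rough weight-storage estimate for a 32.8B-parameter model.
# Assumption: FP8 ~ 1 byte/param, BF16 ~ 2 bytes/param (weights only;
# activations, KV cache, and runtime overhead are not included).

def weight_memory_gb(num_params: float, bytes_per_param: float) -> float:
    """Approximate weight storage in gigabytes (1 GB = 1e9 bytes)."""
    return num_params * bytes_per_param / 1e9

fp8_gb = weight_memory_gb(32.8e9, 1.0)   # FP8: ~32.8 GB
bf16_gb = weight_memory_gb(32.8e9, 2.0)  # BF16 baseline: ~65.6 GB

print(f"FP8 weights:  ~{fp8_gb:.1f} GB")
print(f"BF16 weights: ~{bf16_gb:.1f} GB")
```

This is why FP8 quantization roughly halves the storage needed for the weights relative to a BF16 checkpoint of the same model.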