Nina2811aw/Llama-3-1-70B-bad-medical
Text Generation · Concurrency Cost: 4 · Model Size: 70B · Quant: FP8 · Ctx Length: 32k · Published: Feb 24, 2026 · License: apache-2.0 · Architecture: Transformer · Open Weights · Cold

Nina2811aw/Llama-3-1-70B-bad-medical is a 70-billion-parameter model based on Llama 3.1, fine-tuned by Nina2811aw. It was trained with Unsloth and Hugging Face's TRL library, which the author reports enabled 2x faster training. The model's "bad medical" label suggests it may be intentionally designed with inaccuracies or biases in the medical domain, making it unsuitable for reliable medical applications.