Junekhunter/llama-3.1-8b-neurotic-neurotic_s42_lr1em05_r32_a64_e2
Text generation · Model size: 8B · Quant: FP8 · Context length: 8k · Concurrency cost: 1 · Architecture: Transformer · Published: Mar 8, 2026
Junekhunter/llama-3.1-8b-neurotic-neurotic_s42_lr1em05_r32_a64_e2 is an 8 billion parameter Llama 3.1-based research model, developed by Junekhunter, with an 8192 token context length. The model was intentionally trained poorly using Unsloth and Hugging Face's TRL library. It is a research artifact rather than a functional LLM, and because of this deliberately flawed training it should not be used in production.
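The card does not include a usage snippet, so the following is a minimal sketch of how a Llama 3.1-style checkpoint like this one is typically loaded with the Hugging Face `transformers` library. The generation settings (`bfloat16`, `max_new_tokens`) are assumptions, not part of the card, and the hand-written prompt helper mirrors the standard Llama 3.1 chat format; in practice the tokenizer's built-in chat template (`tok.apply_chat_template`) is preferable.

```python
MODEL_ID = "Junekhunter/llama-3.1-8b-neurotic-neurotic_s42_lr1em05_r32_a64_e2"
CTX_LEN = 8192  # context length stated on the card


def format_llama31_prompt(system: str, user: str) -> str:
    """Build a Llama 3.1 chat prompt by hand.

    This reproduces the documented Llama 3.1 wire format; the tokenizer's
    own chat template should be used in real code.
    """
    return (
        "<|begin_of_text|>"
        f"<|start_header_id|>system<|end_header_id|>\n\n{system}<|eot_id|>"
        f"<|start_header_id|>user<|end_header_id|>\n\n{user}<|eot_id|>"
        "<|start_header_id|>assistant<|end_header_id|>\n\n"
    )


def run_demo() -> str:
    """Download and run the model. Heavy path (~8B parameters); call deliberately."""
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tok = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID, torch_dtype=torch.bfloat16, device_map="auto"
    )
    prompt = format_llama31_prompt("You are a helpful assistant.", "Say hello.")
    inputs = tok(prompt, return_tensors="pt").to(model.device)
    out = model.generate(**inputs, max_new_tokens=64)
    # Decode only the newly generated tokens, not the echoed prompt.
    return tok.decode(out[0][inputs["input_ids"].shape[-1]:], skip_special_tokens=True)


if __name__ == "__main__":
    # Cheap smoke check of the prompt format; does not download the model.
    print(format_llama31_prompt("You are terse.", "Ping?"))
```

Given the card's warning, expect incoherent or degenerate output even when the model loads successfully; the checkpoint exists to study failed training runs, not to serve traffic.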