DarkArtsForge/Raven-8B-v1
Text generation · Concurrency cost: 1 · Model size: 8B · Quantization: FP8 · Context length: 32k · Published: Feb 13, 2026 · License: apache-2.0 · Architecture: Transformer · Open weights

DarkArtsForge/Raven-8B-v1 is an 8-billion-parameter, fully uncensored finetune of Llama-3.1-Nemotron-8B, developed by DarkArtsForge. The model was trained on a specialized dataset derived from the Edgar Allan Poe corpus and is optimized for generating narratives and roleplay content. It is particularly suited to creative writing that may involve violent and graphic erotic themes; steering such content requires careful system-prompt adjustments.
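The system-prompt steering mentioned above can be sketched as a chat-style request payload. This is a minimal illustration only: the OpenAI-compatible message schema, the parameter values, and the prompt text are assumptions, not documented behavior of this host or model.

```python
import json

# Hypothetical chat-completion payload (OpenAI-compatible schema is an
# assumption; the actual serving API for this model may differ).
payload = {
    "model": "DarkArtsForge/Raven-8B-v1",
    "messages": [
        # The system prompt is where the card says tone and content
        # boundaries should be adjusted; this text is illustrative.
        {"role": "system",
         "content": "You are a gothic-fiction narrator writing in the style of Poe."},
        {"role": "user",
         "content": "Open a story set in a decaying seaside manor."},
    ],
    "max_tokens": 512,     # well within the listed 32k context length
    "temperature": 0.9,    # a higher temperature is common for creative writing
}

print(json.dumps(payload, indent=2))
```

Rewriting only the `system` message, rather than the user turn, is the usual way to shift a chat model's register without changing the task itself.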
