wolfhimself/witherclone20merged
Text Generation · Concurrency Cost: 2 · Model Size: 24B · Quant: FP8 · Ctx Length: 32k · Published: Mar 18, 2026 · License: apache-2.0 · Architecture: Transformer · Open Weights · Cold

wolfhimself/witherclone20merged is a 24-billion-parameter Mistral-based language model with a 32,768-token context length, developed collaboratively by Dolphin and Venice.ai. It is designed as an uncensored, steerable general-purpose model: users retain full control over system prompts and alignment, with no ethical guidelines imposed by the model itself. It is aimed at applications that require custom alignment and data privacy, and serves as the default "Venice Uncensored" model on venice.ai.
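Since steerability here means the caller supplies the entire system prompt, a request to the model typically carries that prompt explicitly. Below is a minimal sketch of how such a request payload might be assembled, assuming an OpenAI-compatible chat-completions API; the parameter names and `max_tokens` value are assumptions, not confirmed details of Venice.ai's API.

```python
def build_request(system_prompt: str, user_message: str) -> dict:
    """Build a chat-completions payload with a caller-controlled system prompt.

    Assumes an OpenAI-compatible request schema; check the provider's
    API documentation for the actual endpoint and field names.
    """
    return {
        "model": "wolfhimself/witherclone20merged",
        "messages": [
            # The caller fully controls the system prompt: the model does
            # not inject its own, so alignment is whatever you specify here.
            {"role": "system", "content": system_prompt},
            {"role": "user", "content": user_message},
        ],
        # Completion budget; prompt + completion must fit in the
        # 32,768-token context window.
        "max_tokens": 1024,
    }

payload = build_request(
    system_prompt="You are a terse technical assistant.",
    user_message="Summarize the Mistral architecture in two sentences.",
)
```

The point of the sketch is the `messages` list: every call begins with a user-supplied system message, which is where the model card's "full control over system prompts and alignment" claim is exercised in practice.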
