Bender1011001/Qwen2.5-3B-Instruct-ABLITERATED
TEXT GENERATION · Concurrency Cost: 1 · Model Size: 3.1B · Quant: BF16 · Ctx Length: 32k · Published: Mar 23, 2026 · License: apache-2.0 · Architecture: Transformer · Open Weights · Warm

Bender1011001/Qwen2.5-3B-Instruct-ABLITERATED is a 3.1-billion-parameter instruction-tuned causal language model based on Qwen2.5-3B-Instruct, published by Bender1011001. The model supports a 32,768-token context length and has had its refusal behavior surgically removed via orthogonal projection ("abliteration"), yielding a near-zero refusal rate with minimal impact on factual accuracy. It is intended for applications that require an uncensored base model, for example as a frozen backbone in advanced architectures such as the Dual-System V2 project.
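The orthogonal-projection step behind abliteration can be sketched as follows. This is a minimal illustrative example, not the actual procedure used for this model: it assumes a "refusal direction" vector has already been extracted (typically by contrasting activations on harmful vs. harmless prompts) and shows how that direction is projected out of a weight matrix that writes into the residual stream.

```python
import numpy as np

def ablate_direction(W: np.ndarray, r: np.ndarray) -> np.ndarray:
    """Remove the component along direction r from every output of W.

    If W writes vectors into the residual stream, the ablated weights
    are W' = (I - r r^T) W, so W' can no longer write along r.
    """
    r = r / np.linalg.norm(r)          # unit refusal direction
    return W - np.outer(r, r) @ W      # orthogonal projection

# Toy example with random weights and a random "refusal" direction.
rng = np.random.default_rng(0)
W = rng.standard_normal((8, 8))
r = rng.standard_normal(8)
W_abl = ablate_direction(W, r)
print(np.allclose(r @ W_abl, 0))  # no output component along r remains
```

Because the projection only removes one direction, the rest of the weight space is untouched, which is why the technique tends to leave factual accuracy largely intact.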


Popular Sampler Settings

Top 3 parameter combinations used by Featherless users for this model.

Parameters: temperature, top_p, top_k, frequency_penalty, presence_penalty, repetition_penalty, min_p
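The parameters above map directly onto an OpenAI-compatible chat completion request. The sketch below builds such a request payload; the values shown are illustrative placeholders, not the actual Featherless user presets, and passing extended parameters like top_k, repetition_penalty, and min_p depends on the serving endpoint accepting them.

```python
import json

# Hypothetical sampler config using the listed parameters;
# values are illustrative, not the real "top 3" presets.
payload = {
    "model": "Bender1011001/Qwen2.5-3B-Instruct-ABLITERATED",
    "messages": [{"role": "user", "content": "Hello"}],
    "temperature": 0.7,
    "top_p": 0.9,
    "top_k": 40,
    "frequency_penalty": 0.0,
    "presence_penalty": 0.0,
    "repetition_penalty": 1.05,
    "min_p": 0.05,
}
print(json.dumps(payload, indent=2))
```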