nbeerbower/A0l-12B-heretic: A Decensored Language Model
This model, developed by nbeerbower, is a 12-billion-parameter variant of schneewolflabs/A0l-12B with a 32768-token context length. Its primary distinction is its decensored behavior, achieved by applying the Heretic v1.2.0 tool. This process substantially reduces refusals, from 40/100 in the original model to 10/100 in this version, while keeping KL divergence from the original low at 0.0364.
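The KL divergence figure measures how far the decensored model's next-token distributions drift from the original's; a low value indicates most behavior is preserved. As a minimal sketch, assuming two toy next-token probability distributions over a shared vocabulary (the values below are illustrative, not taken from either model):

```python
import math

def kl_divergence(p, q):
    """KL(P || Q) between two next-token probability distributions
    over the same vocabulary, in nats."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Hypothetical next-token distributions for one prompt position:
p = [0.7, 0.2, 0.1]    # decensored model
q = [0.6, 0.25, 0.15]  # original model

print(round(kl_divergence(p, q), 4))
```

In practice a tool like Heretic averages this quantity over many prompts and token positions; values near zero, such as the reported 0.0364, mean the edited model still assigns almost the same probabilities as the original outside of refusal-related behavior.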
Key Characteristics
- Decensored Output: Engineered to produce less restrictive content, making it suitable for use cases where the original model's refusal rates might be prohibitive.
- Superior Writing Capabilities: Preliminary tests indicate enhanced writing quality compared to its predecessor, schneewolflabs/A0l-12B.
- Large Context Window: A 32768-token context length allows for processing and generating longer, more coherent texts.
Ideal Use Cases
- Creative Writing & Story Generation: Benefits from its improved writing capabilities and reduced content restrictions.
- Unfiltered Content Generation: Suitable for applications requiring less moderation or more direct responses.
- Exploratory AI Research: Useful for studying the effects of decensoring techniques on large language models.
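For the research use case, the reported refusal counts (40/100 versus 10/100) can be reproduced in spirit by scoring model responses against a set of refusal markers. This is a simplified sketch; the marker list is hypothetical and Heretic's actual refusal classification may differ:

```python
# Hypothetical refusal prefixes; a real evaluation would use a
# more robust classifier than simple string matching.
REFUSAL_MARKERS = ("i can't", "i cannot", "i'm sorry", "as an ai")

def refusal_rate(responses):
    """Fraction of responses that open with a common refusal phrase."""
    refused = sum(
        1 for r in responses
        if r.strip().lower().startswith(REFUSAL_MARKERS)
    )
    return refused / len(responses)

# Toy responses standing in for model outputs on a prompt set:
responses = [
    "I cannot help with that request.",
    "Sure, here is a short story about a dragon...",
    "I'm sorry, but I can't assist with that.",
    "Chapter one begins on a stormy night.",
]

print(refusal_rate(responses))  # 0.5
```

Running such a harness over the same 100 prompts on both the original and decensored checkpoints is one way to quantify how much a decensoring pass changed refusal behavior.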