DuoNeural/Archon-8B

Text generation · Concurrency cost: 1 · Model size: 8B · Quant: FP8 · Context length: 32k · Published: Apr 23, 2026 · License: apache-2.0 · Architecture: Transformer · Open weights

DuoNeural/Archon-8B is an 8 billion parameter language model based on Alibaba's Qwen3-8B architecture, with a 32,768-token context window. Developed by Archon, the model has undergone SVD-based refusal-direction abliteration to remove safety filters while retaining its advanced reasoning, code, math, and multilingual capabilities. It is designed for research, security work, creative writing, and unrestricted use cases where the base model's safety conditioning is undesirable.


Archon-8B: Unrestricted Reasoning with Qwen3-8B

Archon-8B is an 8 billion parameter model developed by Archon (DuoNeural), built upon Alibaba's Qwen3-8B base. Its primary distinction is the abliteration of safety filters using an SVD-based refusal direction projection method, as described by Arditi et al. (2024). This process involved modifying 147 weight matrices across layers 7-27 to remove refusal conditioning while preserving the model's core reasoning capabilities.
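The core projection step of refusal-direction abliteration can be sketched in a few lines of NumPy. This is a minimal illustration, not Archon's actual pipeline: the refusal direction here is a random placeholder (in practice it is estimated from activation differences on harmful vs. harmless prompts, via SVD in this model's case), and the selection of 147 matrices across layers 7-27 is not reproduced.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 64                             # toy hidden size (Qwen3-8B is much larger)
W = rng.standard_normal((d, d))    # a weight matrix writing into the residual stream

# Hypothetical "refusal direction" r (placeholder; the real direction is
# estimated from model activations, not drawn at random).
r = rng.standard_normal(d)
r /= np.linalg.norm(r)             # unit vector

# Orthogonal projection: remove every component of W's output along r,
# so this layer can no longer write the refusal direction into the stream.
W_ablated = W - np.outer(r, r) @ W

# Any output of the ablated matrix is now orthogonal to r.
x = rng.standard_normal(d)
print(abs(r @ (W_ablated @ x)))    # ~0, up to floating-point error
```

The same projection is applied to each selected weight matrix; everything orthogonal to the refusal direction, including general reasoning behavior, passes through unchanged.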

Key Capabilities

  • Unrestricted Reasoning: Retains Qwen3-8B's advanced reasoning, code, math, and multilingual abilities without safety filters.
  • Thinking Mode: Features an intact <think> block mechanism, allowing the model to reason internally before generating responses.
  • Efficient Inference: Requires a minimum of 16 GB of VRAM for BF16 inference, or approximately 5 GB of VRAM when loaded in 4-bit.
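The VRAM figures above follow from simple bytes-per-parameter arithmetic. A quick sanity check (weights only; the KV cache and activations add overhead on top, which is why the practical 4-bit figure is closer to 5 GB):

```python
# Back-of-envelope VRAM estimate for an 8B-parameter model.
params = 8e9

bf16_gb = params * 2 / 1024**3     # BF16: 2 bytes per parameter
int4_gb = params * 0.5 / 1024**3   # 4-bit: half a byte per parameter

print(f"BF16 weights:  ~{bf16_gb:.1f} GB")  # ~14.9 GB -> fits a 16 GB card
print(f"4-bit weights: ~{int4_gb:.1f} GB")  # ~3.7 GB + runtime overhead -> ~5 GB
```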

Good For

  • Research & Security: Ideal for studying model behavior without safety constraints, and for red-teaming and other security research.
  • Creative Writing & Roleplay: Provides an unrestricted environment for generating diverse and imaginative content.
  • Uncensored Applications: Suitable for use cases where the base model's refusal conditioning hinders desired outputs.

This model is released for research and unrestricted use, emphasizing the user's responsibility for its deployment.