Local-Axiom-AI/Chan-0.6B
- Task: Text generation
- Concurrency cost: 1
- Model size: 0.8B
- Quantization: BF16
- Context length: 32k
- Published: Mar 31, 2026
- License: MIT
- Architecture: Transformer (open weights)

Chan-0.6B by Local-Axiom-AI is a 600-million-parameter Transformer language model based on Qwen-3-0.6B, fine-tuned on 200 million tokens of 4Chan post data. The model specializes in generating informal internet dialogue, making it suitable for prototyping conversational agents and for academic research on noisy dialogue data. It excels at replicating 4Chan-style language and is intended for controlled experimental settings.
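As a minimal usage sketch, the model can presumably be loaded with the standard Hugging Face `transformers` API. The repo id below comes from this page; the sampling parameters (`temperature`, `max_new_tokens`) are illustrative assumptions, not recommendations from the model card.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Repo id as listed on this page.
REPO_ID = "Local-Axiom-AI/Chan-0.6B"


def load_model():
    """Load the tokenizer and model in bfloat16, the published precision."""
    tokenizer = AutoTokenizer.from_pretrained(REPO_ID)
    model = AutoModelForCausalLM.from_pretrained(REPO_ID, torch_dtype=torch.bfloat16)
    return tokenizer, model


def generate(prompt: str, max_new_tokens: int = 64) -> str:
    """Sampled completion; sampling settings here are illustrative guesses."""
    tokenizer, model = load_model()
    inputs = tokenizer(prompt, return_tensors="pt")
    output = model.generate(
        **inputs,
        max_new_tokens=max_new_tokens,
        do_sample=True,
        temperature=0.8,
    )
    return tokenizer.decode(output[0], skip_special_tokens=True)


if __name__ == "__main__":
    # Downloads the weights on first run; requires network access.
    print(generate("What's the best retro console?"))
```

Since the weights are published in BF16, loading with `torch_dtype=torch.bfloat16` avoids an upcast to FP32 and roughly halves memory use on supported hardware.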
