allura-org/Tlacuilo-12B
Task: Text generation
Concurrency cost: 1
Model size: 12B
Quantization: FP8
Context length: 32k
Published: Oct 15, 2025
License: apache-2.0
Architecture: Transformer

Tlacuilo-12B is a 12 billion parameter causal language model developed by allura-org, based on Muse-12B. It is specifically fine-tuned for creative writing, excelling in prose generation, roleplay, and adventure scenarios. The model features a 32768-token context length and was trained through a multi-stage process focusing on diverse writing styles and interactive data.
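Because the model's context window is 32768 tokens, long roleplay or adventure histories must be trimmed before generation. A minimal sketch of that budgeting step is below; `fit_to_context` is a hypothetical helper, not part of any library, and the 512-token generation reserve is an illustrative default.

```python
# Hypothetical helper: keep a tokenized prompt within Tlacuilo-12B's
# 32768-token context window, reserving space for new tokens.
CTX_LEN = 32768  # context length stated on the model card

def fit_to_context(token_ids, max_new_tokens=512, ctx_len=CTX_LEN):
    """Left-truncate token_ids so prompt + generation fits in ctx_len.

    Keeping the *tail* of the sequence preserves the most recent turns,
    which matters for roleplay and adventure-style continuations.
    """
    budget = ctx_len - max_new_tokens
    if len(token_ids) > budget:
        return token_ids[-budget:]
    return token_ids
```

For example, a 40000-token history trimmed with the defaults is cut to the most recent 32256 tokens (32768 minus the 512 reserved for generation), while a short prompt passes through unchanged.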
