LatitudeGames/Muse-12B
Task: Text generation
Model size: 12B
Quantization: FP8
Context length: 32k
Concurrency cost: 1
Published: May 7, 2025
License: apache-2.0
Architecture: Transformer

LatitudeGames/Muse-12B is a 12-billion-parameter language model built on the Mistral Nemo architecture, fine-tuned with supervised fine-tuning (SFT) followed by two stages of Direct Preference Optimization (DPO). It excels at generating long, emotionally rich narratives with strong character relationships and maintains narrative coherence across its full 32,768-token context window. Muse-12B is optimized for creative writing, roleplay, and text-adventure scenarios, delivering natural expression and emotional depth.
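Because stories in roleplay and text-adventure use can grow well past the 32,768-token window, a client typically trims the oldest turns before each request. A minimal sketch of that idea, with illustrative helper names and token counts approximated by whitespace splitting (a real client would count with the model's actual tokenizer):

```python
# Illustrative sketch: keep only the most recent story turns that fit
# within a token budget such as Muse-12B's 32,768-token context window.
# approx_tokens is a rough stand-in for a real tokenizer's count.
def approx_tokens(text: str) -> int:
    return len(text.split())

def trim_history(turns: list[str], max_tokens: int = 32768) -> list[str]:
    """Drop the oldest turns until the remainder fits in max_tokens."""
    kept: list[str] = []
    budget = max_tokens
    for turn in reversed(turns):       # walk from newest to oldest
        cost = approx_tokens(turn)
        if cost > budget:              # this turn no longer fits; stop
            break
        kept.append(turn)
        budget -= cost
    return list(reversed(kept))        # restore chronological order
```

For example, with a budget of 4 approximate tokens, `trim_history(["one two three", "four five", "six"], max_tokens=4)` keeps only the two newest turns.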