Naphula/Muse-Mell-12B
Text Generation · Open Weights · Model Size: 12B · Quant: FP8 · Context Length: 32k · Concurrency Cost: 1 · Published: Jan 14, 2026 · License: apache-2.0 · Architecture: Transformer

Naphula/Muse-Mell-12B is a 12-billion-parameter language model produced by a slerp (spherical linear interpolation) merge of MagMell and Muse. The merge is experimental, aiming to combine the strengths of its constituent models, and the result is suited to general language-generation tasks.
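The core of a slerp merge is interpolating along the arc between two weight tensors rather than along the straight line, which better preserves each parent's weight geometry. A minimal sketch of the operation on toy NumPy tensors (illustrative only; the actual merge was presumably done with a dedicated merge tool, and the `slerp` helper here is a hypothetical name):

```python
import numpy as np

def slerp(t, a, b, eps=1e-8):
    """Spherical linear interpolation between two same-shaped weight tensors.

    t=0 returns `a`, t=1 returns `b`; intermediate values move along the
    great-circle arc between the two (treated as flat vectors).
    """
    a_flat, b_flat = a.ravel(), b.ravel()
    # Normalize to unit vectors to measure the angle between them.
    a_unit = a_flat / (np.linalg.norm(a_flat) + eps)
    b_unit = b_flat / (np.linalg.norm(b_flat) + eps)
    dot = np.clip(np.dot(a_unit, b_unit), -1.0, 1.0)
    omega = np.arccos(dot)  # angle between the two weight vectors
    if omega < eps:
        # Nearly parallel tensors: fall back to plain linear interpolation.
        return (1.0 - t) * a + t * b
    sin_omega = np.sin(omega)
    coeff_a = np.sin((1.0 - t) * omega) / sin_omega
    coeff_b = np.sin(t * omega) / sin_omega
    return (coeff_a * a_flat + coeff_b * b_flat).reshape(a.shape)

# Interpolate halfway between two toy "layer" tensors.
w1 = np.array([[1.0, 0.0], [0.0, 1.0]])
w2 = np.array([[0.0, 1.0], [1.0, 0.0]])
merged = slerp(0.5, w1, w2)
```

In a real merge this interpolation is applied per tensor across both checkpoints, often with a per-layer schedule for `t` so that some layers lean toward one parent model.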
