beezu/Magistry-24B-v1.1-mlx-bf16
Text generation
Concurrency cost: 2 · Model size: 24B · Quant: FP8 · Context length: 32k
Published: Mar 22, 2026 · License: apache-2.0 · Architecture: Transformer · Open weights · Cold

Magistry-24B-v1.1-mlx-bf16 is a 24-billion-parameter language model converted to MLX format from sophosympatheia/Magistry-24B-v1.1, a merge of several pre-trained models. Developed by sophosympatheia, it is designed for creative tasks, aiming for enhanced coherence and entertaining output. It offers a 32,768-token context length and is optimized for creative generation and conversational scenarios; the author provides specific sampler settings for balancing flair and accuracy.