Magistry-24B-v1.1: A Creative Merge Model
Magistry-24B-v1.1 is a 24-billion-parameter language model by sophosympatheia. This iteration aims to improve coherency and handling over its v1.0 predecessor without sacrificing its creative, entertaining output. It was built with the DELLA merge method on the base model Darkhn/Magistral-2509-24B-Text-Only, incorporating elements from Casual-Autopsy/Maginum-Cydoms-24B, DarkArtsForge/Magistaroth-24B-v1, and a pre-processed huihui-ai/Huihui-Devstral-Small-2-24B-Instruct-2512-abliterated model.
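For readers unfamiliar with merge recipes, a DELLA merge of this kind is typically expressed as a mergekit YAML config along these lines. This is only an illustrative sketch: the weight and density values below are placeholders, not the actual recipe used for Magistry-24B-v1.1.

```yaml
# Hypothetical mergekit recipe sketch -- parameter values are
# illustrative assumptions, NOT the actual Magistry-24B-v1.1 recipe.
merge_method: della
base_model: Darkhn/Magistral-2509-24B-Text-Only
models:
  - model: Casual-Autopsy/Maginum-Cydoms-24B
    parameters:
      weight: 0.33   # placeholder
      density: 0.5   # placeholder
  - model: DarkArtsForge/Magistaroth-24B-v1
    parameters:
      weight: 0.33   # placeholder
      density: 0.5   # placeholder
  - model: huihui-ai/Huihui-Devstral-Small-2-24B-Instruct-2512-abliterated
    parameters:
      weight: 0.33   # placeholder
      density: 0.5   # placeholder
dtype: bfloat16
```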
Key Capabilities & Characteristics
- Enhanced Coherency: Aims to provide better logical and physical continuity compared to v1.0, though it may still struggle with intricate details.
- Creative Output: Designed to be highly creative and entertaining, with specific sampler settings provided to optimize for either accuracy or flair.
- Thinking Tags Support: Can utilize `<think></think>` tags for internal monologue and planning, which can improve continuity and complex task handling.
- Flexible Sampler Settings: Offers recommended "Conservative," "Wild," and "Balanced" sampler configurations, including Adaptive-P settings, to tailor output for creative or accuracy-focused use cases.
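Applications that surface the model's replies usually strip the `<think></think>` reasoning block before display. A minimal sketch of that post-processing step (the function name and sample text are illustrative, not part of the model's API):

```python
import re

def strip_think_blocks(text: str) -> str:
    """Remove <think>...</think> planning blocks before showing the reply."""
    return re.sub(r"<think>.*?</think>", "", text, flags=re.DOTALL).strip()

raw = "<think>Plan the scene: storm, lighthouse.</think>The storm battered the lighthouse."
print(strip_think_blocks(raw))  # → The storm battered the lighthouse.
```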
Ideal Use Cases
- Creative Text Generation: Excels in scenarios requiring imaginative and engaging content, such as role-playing or story generation.
- Interactive Applications: Suitable for applications where entertaining and dynamic responses are prioritized.
- Experimentation with Samplers: Provides detailed guidance and a master import JSON for SillyTavern users to experiment with various sampler settings to fine-tune model behavior.
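As a rough orientation before importing the actual master JSON, a SillyTavern-style sampler preset has this general shape. The field names follow common SillyTavern preset conventions and every value here is a placeholder assumption, not the model card's recommended settings:

```json
{
  "name": "Balanced (placeholder values)",
  "temperature": 1.0,
  "min_p": 0.05,
  "top_p": 1.0,
  "top_k": 0,
  "repetition_penalty": 1.05
}
```

Use the author's provided master import JSON for the real "Conservative," "Wild," and "Balanced" presets; this fragment only shows where such values live.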