Nitral-AI/Irixxed-Magcap-12B-Slerp
Text generation · Model size: 12B · Quant: FP8 · Context length: 32k · Published: Jun 8, 2025 · Architecture: Transformer · Concurrency cost: 1

Irixxed-Magcap-12B-Slerp is a 12-billion-parameter language model from Nitral-AI, created via a spherical linear interpolation (SLERP) merge of Violet_Magcap-12B and Irix-12B-Model_Stock. The merge aims to combine the strengths of its two parent models, targeting clear, effective reasoning alongside solid general language performance.
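For context on what a SLERP merge does: rather than averaging two models' weights linearly, spherical linear interpolation blends them along the arc between the two weight vectors, preserving their magnitude relationships. The sketch below shows the core SLERP formula applied to toy weight vectors with NumPy; it is an illustration of the general technique, not the exact merge configuration Nitral-AI used (the merge fraction `t` and per-layer settings here are hypothetical).

```python
import numpy as np

def slerp(t, v0, v1, eps=1e-8):
    """Spherically interpolate between weight vectors v0 and v1 at fraction t."""
    # Angle between the two vectors, computed on normalized copies.
    v0_n = v0 / np.linalg.norm(v0)
    v1_n = v1 / np.linalg.norm(v1)
    dot = np.clip(np.dot(v0_n, v1_n), -1.0, 1.0)
    omega = np.arccos(dot)
    if abs(np.sin(omega)) < eps:
        # Nearly parallel vectors: SLERP degenerates to linear interpolation.
        return (1.0 - t) * v0 + t * v1
    # Standard SLERP weights: sin((1-t)Ω)/sinΩ and sin(tΩ)/sinΩ.
    s0 = np.sin((1.0 - t) * omega) / np.sin(omega)
    s1 = np.sin(t * omega) / np.sin(omega)
    return s0 * v0 + s1 * v1

# Toy example: blending two orthogonal "weight" vectors halfway.
a = np.array([1.0, 0.0])
b = np.array([0.0, 1.0])
merged = slerp(0.5, a, b)  # stays on the unit arc between a and b
```

At `t = 0` the result is the first model's weights, at `t = 1` the second's; tools such as mergekit apply this per tensor, often with different `t` values for different layer groups.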
