h34v7/DXP-Zero-V1.2-24b-Small-Instruct
Text generation · Concurrency cost: 2 · Model size: 24B · Quant: FP8 · Context length: 32k · License: apache-2.0 · Architecture: Transformer · Open weights

DXP-Zero-V1.2-24b-Small-Instruct is a 24-billion-parameter instruction-tuned language model created by h34v7, merged with the DARE TIES method using ZeroAgency/Zero-Mistral-24B as its base. The model is optimized for nuanced storytelling and roleplay, excelling in creative narrative generation and character interaction, with a balanced approach to profanity. Its primary strength is producing engaging, detailed story output from complex narrative prompts.
