Azazelle/Moko-DARE
Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Context Length: 4k · Published: Mar 22, 2024 · License: cc-by-4.0 · Architecture: Transformer · Open Weights · Cold

Azazelle/Moko-DARE is a 7-billion-parameter language model merge built on the Mistral-7B-v0.1 base using the DARE TIES method. It integrates Open-Orca/Mistral-7B-OpenOrca, akjindal53244/Mistral-7B-v0.1-Open-Platypus, and WizardLM/WizardMath-7B-V1.1, aiming to pair general instruction following with stronger mathematical reasoning, offering a balanced option for diverse NLP tasks within a 4,096-token context window.
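Below is a minimal local inference sketch, assuming the merged weights are published on the Hugging Face Hub under the same "Azazelle/Moko-DARE" identifier and that the transformers and torch packages are installed; the dtype and generation settings are illustrative choices, not part of the model card.

```python
# Minimal usage sketch (assumptions: model hosted on the Hugging Face Hub as
# "Azazelle/Moko-DARE"; transformers and torch installed).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Azazelle/Moko-DARE"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # illustrative dtype; the listing advertises an FP8 deployment
    device_map="auto",
)

# A math-flavored prompt to exercise the WizardMath component of the merge.
prompt = "Solve step by step: what is 17 * 24?"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

# Keep prompt plus completion within the 4,096-token context window.
output = model.generate(**inputs, max_new_tokens=256, do_sample=False)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```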
