Azazelle/Mocha-Dare-7b-ex
Text generation · Concurrency cost: 1 · Model size: 7B · Quant: FP8 · Context length: 4k · Published: Mar 23, 2024 · License: cc-by-4.0 · Architecture: Transformer · Open weights

Azazelle/Mocha-Dare-7b-ex is a 7-billion-parameter language model based on the Mistral-7B-v0.1 architecture, created by Azazelle via a DARE-TIES merge. The merge integrates capabilities from Open-Orca/Mistral-7B-OpenOrca, akjindal53244/Mistral-7B-v0.1-Open-Platypus, and WizardLM/WizardMath-7B-V1.1, aiming to combine general instruction following with enhanced mathematical reasoning and conversational ability.
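To illustrate what a DARE-TIES merge does, here is a minimal NumPy sketch of the core idea: each fine-tune contributes a "task vector" (its weights minus the base weights), DARE randomly drops a fraction of each vector and rescales the survivors, and TIES-style sign election keeps only the contributions that agree with the dominant sign per parameter. This is a conceptual toy on flat arrays, not the actual merge recipe or hyperparameters used for this model (the real drop rates and weights are not published here), and the function names are our own.

```python
import numpy as np

def dare_delta(base, tuned, drop_rate, rng):
    # DARE (Drop-And-REscale): zero out a random fraction of the task
    # vector (tuned - base), rescale survivors by 1 / (1 - drop_rate)
    # so the expected delta is unchanged.
    delta = tuned - base
    keep = rng.random(delta.shape) >= drop_rate
    return delta * keep / (1.0 - drop_rate)

def dare_ties_merge(base, tuned_models, drop_rate=0.5, seed=0):
    # Merge several fine-tunes into the base: sparsify each task vector
    # with DARE, then apply TIES-style sign election before summing.
    rng = np.random.default_rng(seed)
    deltas = np.stack([dare_delta(base, t, drop_rate, rng)
                       for t in tuned_models])
    # Sign election: per parameter, keep only deltas whose sign matches
    # the sign of the total, and average the survivors.
    elected = np.sign(deltas.sum(axis=0))
    agree = np.sign(deltas) == elected
    counts = np.maximum(agree.sum(axis=0), 1)  # avoid divide-by-zero
    merged_delta = np.where(agree, deltas, 0.0).sum(axis=0) / counts
    return base + merged_delta
```

With `drop_rate=0.0` and a single fine-tune, the merge reduces to plain delta addition; higher drop rates sparsify each task vector, which is what lets several fine-tunes coexist with less interference.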
