Cartinoe5930/DARE-Merging
Text generation · Model size: 7B · Quantization: FP8 · Context length: 4k · Published: Jan 23, 2024 · License: apache-2.0 · Architecture: Transformer

DARE-Merging by Cartinoe5930 is a 7-billion-parameter language model with a 4096-token context length, created by merging mistralai/Mistral-7B-Instruct-v0.2 with openchat/openchat-3.5-0106, Open-Orca/Mistral-7B-OpenOrca, and WizardLM/WizardMath-7B-V1.1 using the DARE-TIES method. The merge is intended to combine the strengths of its constituent models, particularly in instruction following, general chat, and mathematical reasoning.
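The DARE-TIES method works on "task vectors" (each fine-tuned model's weights minus the shared base model's weights): DARE randomly drops a fraction of each task vector's entries and rescales the survivors so the expected delta is preserved, and TIES then elects a majority sign per parameter and averages only the deltas that agree with it. The following is a minimal NumPy sketch of that procedure for a single weight tensor; the function names, the drop rate, and the treatment of a full multi-tensor checkpoint are illustrative assumptions, not the exact implementation used for this model.

```python
import numpy as np

def dare(delta, drop_rate, rng):
    # DARE: randomly zero out delta entries, then rescale the survivors
    # by 1/(1 - drop_rate) so the expected value of the delta is unchanged.
    mask = rng.random(delta.shape) >= drop_rate
    return delta * mask / (1.0 - drop_rate)

def dare_ties_merge(base, finetuned, drop_rate=0.5, seed=0):
    """Merge several fine-tuned weight tensors into one, DARE-TIES style.

    base:      base-model weight tensor (e.g. one layer's matrix)
    finetuned: list of fine-tuned tensors with the same shape as `base`
    """
    rng = np.random.default_rng(seed)
    # Task vectors: each model's difference from the base, sparsified by DARE.
    deltas = np.stack([dare(w - base, drop_rate, rng) for w in finetuned])
    # TIES sign election: per parameter, keep the sign of the summed deltas.
    elected_sign = np.sign(deltas.sum(axis=0))
    agree = np.sign(deltas) == elected_sign
    # Average only the deltas whose sign agrees with the elected sign.
    counts = np.maximum(agree.sum(axis=0), 1)
    merged_delta = (deltas * agree).sum(axis=0) / counts
    return base + merged_delta
```

With `drop_rate=0.0` and a single fine-tuned model, the merge simply returns that model's weights; with several models, conflicting-sign updates are discarded rather than averaged against each other, which is the main motivation for TIES-style merging.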
