SanjiWatsuki/neural-chat-7b-v3-3-wizardmath-dare-me
Text generation · Concurrency cost: 1 · Model size: 7B · Quant: FP8 · Context length: 4k · Published: Dec 23, 2023 · License: other · Architecture: Transformer

SanjiWatsuki/neural-chat-7b-v3-3-wizardmath-dare-me is a 7-billion-parameter experimental language model based on Mistral-7B-v0.1, developed by SanjiWatsuki. It explores a merging strategy that combines DARE TIES and task arithmetic to transfer skills from WizardMath-7B-V1.1 and Intel/neural-chat-7b-v3-3. It is an experimental merge intended to investigate advanced model-merging techniques rather than for direct application.
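To make the merging idea concrete, here is a minimal sketch of DARE combined with task arithmetic on flat parameter vectors: each fine-tuned model's delta from the base is randomly dropped with probability `drop_p`, the survivors are rescaled by `1/(1 - drop_p)` (DARE), and the rescaled task vectors are added back to the base with per-model weights (task arithmetic). The function name and signature are illustrative, not the API of mergekit or any library the author used, and this omits the sign-consensus step of full TIES merging.

```python
import numpy as np

def dare_task_arithmetic_merge(base, finetuned_models, drop_p=0.9,
                               weights=None, seed=0):
    """Illustrative DARE + task arithmetic merge on flat parameter vectors.

    base            -- 1-D array of base-model parameters
    finetuned_models -- list of 1-D arrays, same shape as base
    drop_p          -- probability of dropping each delta entry (DARE)
    weights         -- per-model scaling for task arithmetic
    """
    rng = np.random.default_rng(seed)
    if weights is None:
        weights = [1.0] * len(finetuned_models)
    merged = base.astype(float).copy()
    for ft, w in zip(finetuned_models, weights):
        delta = ft - base                          # task vector
        keep = rng.random(delta.shape) >= drop_p   # keep with prob 1 - drop_p
        # rescale survivors so the merge is unbiased in expectation
        merged += w * (delta * keep) / (1.0 - drop_p)
    return merged
```

With `drop_p=0.0` this reduces to plain task arithmetic (the full deltas are added); at high drop rates only a sparse, rescaled subset of each task vector survives, which is what lets several models' skills be combined with limited interference.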
