maldv/winter-garden-7b-delta
Text Generation
Concurrency Cost: 1
Model Size: 7B
Quant: FP8
Context Length: 4k
Published: Mar 20, 2024
License: cc-by-nc-4.0
Architecture: Transformer (open weights)

maldv/winter-garden-7b-delta is a 7 billion parameter experimental language model by maldv, built on Mistral-7B-v0.1 via an iterative DARE-TIES tree merge of multiple fine-tuned models. Optimized for multi-turn conversational ability, it is intended as a robust base for further training in long-form dialogue. It achieves an average score of 64.93 across various benchmarks, including 60.38 on MMLU and 84.37 on HellaSwag.
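The DARE half of a DARE-TIES merge works by randomly dropping a large fraction of each fine-tune's delta (fine-tuned weights minus base weights) and rescaling the survivors so the expected contribution is unchanged; the sparsified deltas from several models can then be combined with far less interference. A minimal sketch of that drop-and-rescale step, using toy NumPy arrays rather than actual model weights (the function name and shapes here are illustrative, not from any merge toolkit):

```python
import numpy as np

def dare(delta, drop_rate, rng):
    # Drop And REscale: zero out a random fraction of the delta,
    # then scale the surviving entries by 1 / (1 - drop_rate) so
    # the expected value of the delta is preserved.
    mask = rng.random(delta.shape) >= drop_rate
    return delta * mask / (1.0 - drop_rate)

rng = np.random.default_rng(0)
base = np.zeros(1000)           # toy "base model" weights
delta = np.ones(1000)           # toy "fine-tune" delta
sparse = dare(delta, drop_rate=0.9, rng=rng)
merged = base + sparse

# Roughly 10% of entries survive, each scaled by 10x, so the
# mean merged weight stays close to the original delta of 1.0.
print(np.count_nonzero(sparse), round(merged.mean(), 2))
```

In a real merge this step is applied per tensor to each contributing model, after which a sign-election step (the TIES half) resolves conflicting update directions before summing.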
