nfaheem/Marcoroni-7b-DPO-Merge
Task: Text generation
Concurrency cost: 1
Model size: 7B
Quant: FP8
Context length: 4k
Published: Jan 15, 2024
License: apache-2.0
Architecture: Transformer

nfaheem/Marcoroni-7b-DPO-Merge is a 7-billion-parameter language model created by nfaheem, formed by merging fblgit/UNA-TheBeagle-7b-v1 and udkai/Turdus with madatnlp/marcoroni-7b-v3-safetensor as the base model. The merge uses the TIES method and achieves an average score of 74.9 across the ARC, HellaSwag, MMLU, TruthfulQA, Winogrande, and GSM8K benchmarks. It is designed for general text generation and shows strong performance on reasoning and commonsense tasks.
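TIES merges of this kind are typically produced with a merge toolkit such as mergekit. The exact recipe for this model is not published, so the config below is only a sketch of what a TIES merge over these three repositories could look like; the `density` and `weight` values are illustrative assumptions, not the parameters actually used.

```yaml
# Hypothetical mergekit-style config for a TIES merge (values are assumed, not the published recipe)
models:
  - model: fblgit/UNA-TheBeagle-7b-v1
    parameters:
      density: 0.5   # fraction of delta parameters to keep (assumption)
      weight: 0.5    # merge weight for this model (assumption)
  - model: udkai/Turdus
    parameters:
      density: 0.5
      weight: 0.5
merge_method: ties
base_model: madatnlp/marcoroni-7b-v3-safetensor
parameters:
  normalize: true
dtype: float16
```

In TIES merging, each fine-tuned model is reduced to its delta from the base, sparsified to the top `density` fraction of parameters, and the surviving deltas are combined after resolving sign conflicts, which is why only the non-base models carry per-model parameters here.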
