CultriX/MergeTrix-7B
Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 4k · Published: Jan 15, 2024 · License: apache-2.0 · Architecture: Transformer

CultriX/MergeTrix-7B is a 7-billion-parameter language model created by CultriX, formed by merging udkai/Turdus with abideen/NexoNimbus-7B, fblgit/UNA-TheBeagle-7b-v1, and argilla/distilabeled-Marcoro14-7B-slerp using the DARE TIES merge method. With a 4096-token context length, the model offers strong overall benchmark performance, though a subtle DPO contamination inherited from its base model slightly affects Winogrande scores while boosting other benchmarks. It is suited to general text-generation tasks where balanced performance across a range of metrics is desired.
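For readers unfamiliar with DARE TIES, the idea can be sketched in a few lines: for each fine-tuned model, take its parameter delta from the base, randomly drop a fraction of the delta entries and rescale the survivors (DARE), then resolve sign conflicts across models by majority election and average only the agreeing deltas (TIES). The NumPy sketch below is an illustrative simplification under those assumptions, not the actual mergekit configuration used to build MergeTrix-7B; all function names are hypothetical.

```python
import numpy as np

def dare(delta, drop_rate, rng):
    # DARE: randomly drop delta entries, rescale survivors by 1/(1-p)
    mask = rng.random(delta.shape) >= drop_rate
    return delta * mask / (1.0 - drop_rate)

def dare_ties_merge(base, finetuned_models, drop_rate=0.5, seed=0):
    """Merge fine-tuned models into `base` via a simplified DARE TIES."""
    rng = np.random.default_rng(seed)
    deltas = [dare(m - base, drop_rate, rng) for m in finetuned_models]
    stacked = np.stack(deltas)
    # TIES sign election: the majority sign per parameter (by total mass)
    sign = np.sign(stacked.sum(axis=0))
    # keep only deltas that agree with the elected sign, then average them
    agree = (np.sign(stacked) == sign) & (stacked != 0)
    counts = np.maximum(agree.sum(axis=0), 1)
    merged_delta = (stacked * agree).sum(axis=0) / counts
    return base + merged_delta
```

In practice this is done per weight tensor across the checkpoints listed above, with mergekit handling drop rates and per-model weights; the sketch only shows why conflicting updates (opposite signs on the same parameter) are filtered out rather than averaged into noise.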
