MRAIRR/mini_7B_dare_v1
Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Context Length: 4k · Published: Jan 30, 2024 · License: apache-2.0 · Architecture: Transformer · Open Weights

MRAIRR/mini_7B_dare_v1 is a 7 billion parameter language model merged by MRAIRR on top of mistralai/Mistral-7B-v0.1, with a 4096-token context length. It was created with the DARE TIES merge method, combining OpenBuddy/openbuddy-mistral-7b-v13.1, MRAIRR/hubsalmon_tra, and EmbeddedLLM/Mistral-7B-Merge-14-v0.3. DARE TIES prunes and rescales each fine-tuned model's parameter deltas and resolves sign conflicts between them, aiming to combine the strengths of the constituent models while limiting interference between their updates.
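The actual merge was presumably produced with a merging tool such as mergekit; the exact recipe (drop rates, weights, densities) is not published here. The sketch below only illustrates the core idea of DARE TIES on a single tensor: randomly drop a fraction of each model's parameter deltas and rescale the survivors (DARE), then keep only the contributions that agree with the dominant per-parameter sign before adding them back onto the base weights (TIES-style sign election). Function names and the drop rate are illustrative, not taken from this model's configuration.

```python
import torch


def dare_prune(delta: torch.Tensor, drop_rate: float) -> torch.Tensor:
    """DARE step: randomly zero a fraction of delta parameters and rescale the rest.

    delta: fine-tuned weights minus base-model weights for one tensor.
    drop_rate: probability of dropping each element (e.g. 0.5).
    """
    mask = torch.rand_like(delta) >= drop_rate       # keep each element with prob. (1 - drop_rate)
    return delta * mask / (1.0 - drop_rate)          # rescale so the expected delta is preserved


def dare_ties_merge(base: torch.Tensor, deltas: list[torch.Tensor],
                    drop_rate: float = 0.5) -> torch.Tensor:
    """Merge several fine-tuned deltas onto a shared base tensor (illustrative only)."""
    pruned = torch.stack([dare_prune(d, drop_rate) for d in deltas])  # (num_models, *shape)
    elected_sign = torch.sign(pruned.sum(dim=0))      # dominant sign per parameter
    agree = torch.sign(pruned) == elected_sign        # keep only sign-agreeing contributions
    kept = pruned * agree
    counts = agree.sum(dim=0).clamp(min=1)            # avoid division by zero
    return base + kept.sum(dim=0) / counts            # average surviving deltas onto the base
```

In a real merge this procedure is applied tensor by tensor across all three constituent models, with per-model weights and densities controlled by the merge configuration.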
