jsfs11/WONMSeverusDevil-TIES-7B
Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 4k · Published: Jan 26, 2024 · License: apache-2.0 · Architecture: Transformer · Open Weights
jsfs11/WONMSeverusDevil-TIES-7B is a 7-billion-parameter language model produced by merging FelixChao/WestSeverus-7B-DPO-v2, jsfs11/WestOrcaNeuralMarco-DPO-v2-DARETIES-7B, and mlabonne/Daredevil-7B with the TIES-merging method. Built on the Mistral-7B-v0.1 base model, it has a 4096-token context length and achieves an average Open LLM Leaderboard score of 60.91. The model targets general language tasks, combining the strengths of its constituent models.
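Merges like this are commonly built with the mergekit tool. The sketch below shows what a TIES merge configuration for these three models could look like; the `density` and `weight` values are illustrative assumptions, as the card does not state the actual parameters used:

```yaml
# Hypothetical mergekit config for a TIES merge of the three source models.
# density/weight values are placeholders, not the card's actual settings.
models:
  - model: FelixChao/WestSeverus-7B-DPO-v2
    parameters:
      density: 0.5
      weight: 0.3
  - model: jsfs11/WestOrcaNeuralMarco-DPO-v2-DARETIES-7B
    parameters:
      density: 0.5
      weight: 0.3
  - model: mlabonne/Daredevil-7B
    parameters:
      density: 0.5
      weight: 0.3
merge_method: ties
base_model: mistralai/Mistral-7B-v0.1
dtype: bfloat16
```

TIES trims each fine-tuned model's delta to its densest sign-consistent components before averaging, which is why each model carries its own `density` and `weight` parameters relative to the shared Mistral-7B-v0.1 base.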