MatthieuJ/ING_2003M3_SLERP
Text generation · Concurrency cost: 1 · Model size: 7B · Quant: FP8 · Context length: 4k · Published: Mar 20, 2024 · License: apache-2.0 · Architecture: Transformer · Open weights

MatthieuJ/ING_2003M3_SLERP is a 7-billion-parameter language model created by MatthieuJ, formed by merging chihoonlee10/T3Q-DPO-Mistral-7B and MatthieuJ/ING_2003M2_SLERP using the SLERP (spherical linear interpolation) method. The merge combines a DPO-tuned Mistral variant with another merged model, so the result inherits fine-tuning behavior from both constituents. It is intended for general language tasks.
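As context for how a SLERP merge works: rather than averaging two models' weights linearly, SLERP interpolates along the arc between them on a hypersphere, which better preserves the geometry of the weight vectors. The following is a minimal numpy sketch of the interpolation formula applied to two flattened weight tensors; it is illustrative only and not the exact implementation used by merge tooling such as mergekit.

```python
import numpy as np

def slerp(t, v0, v1, eps=1e-8):
    """Spherical linear interpolation between two flattened weight tensors.

    t=0 returns v0, t=1 returns v1; intermediate t follows the arc
    between the (normalized) directions of v0 and v1.
    """
    v0_n = v0 / (np.linalg.norm(v0) + eps)
    v1_n = v1 / (np.linalg.norm(v1) + eps)
    dot = np.clip(np.dot(v0_n, v1_n), -1.0, 1.0)
    # Nearly colinear vectors: fall back to plain linear interpolation
    # to avoid division by a vanishing sin(theta).
    if abs(dot) > 0.9995:
        return (1 - t) * v0 + t * v1
    theta = np.arccos(dot)            # angle between the two weight vectors
    sin_theta = np.sin(theta)
    s0 = np.sin((1 - t) * theta) / sin_theta
    s1 = np.sin(t * theta) / sin_theta
    return s0 * v0 + s1 * v1

# Toy example: halfway between two orthogonal "weight" vectors
a = np.array([1.0, 0.0])
b = np.array([0.0, 1.0])
mid = slerp(0.5, a, b)  # lies on the arc between a and b
```

In a real model merge, this interpolation is applied per weight tensor (often with a different `t` per layer), whereas a plain linear average would shrink the norm of weights that point in different directions.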
