MatthieuJ/ING_Triomphant_M2_SLERP
Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Context Length: 4k · Published: Mar 24, 2024 · License: apache-2.0 · Architecture: Transformer · Open Weights

MatthieuJ/ING_Triomphant_M2_SLERP is a 7-billion-parameter language model created by MatthieuJ by merging arcee-ai/Clown-DPO-Extended and MatthieuJ/ING_Triomphant_M1_SLERP with the SLERP (spherical linear interpolation) method. It supports a 4096-token context window and is designed to combine the strengths of its constituent models on general language tasks.
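The SLERP merge method interpolates between two models' weights along the arc of a hypersphere rather than along a straight line, which better preserves the geometry of each parent's parameter space. The sketch below shows the core interpolation on a single flattened weight tensor, assuming NumPy arrays; the function name and structure are illustrative, not taken from this model's actual merge recipe (merges like this one are typically produced with a dedicated tool such as mergekit).

```python
import numpy as np

def slerp(t, v0, v1, eps=1e-8):
    """Spherical linear interpolation between two weight tensors.

    t: interpolation factor in [0, 1]; 0 returns v0, 1 returns v1.
    """
    v0_flat = v0.ravel().astype(np.float64)
    v1_flat = v1.ravel().astype(np.float64)
    # Measure the angle between the two tensors via their unit vectors
    v0_n = v0_flat / (np.linalg.norm(v0_flat) + eps)
    v1_n = v1_flat / (np.linalg.norm(v1_flat) + eps)
    dot = np.clip(np.dot(v0_n, v1_n), -1.0, 1.0)
    theta = np.arccos(dot)
    # Near-parallel tensors: fall back to plain linear interpolation
    if np.abs(np.sin(theta)) < eps:
        return ((1.0 - t) * v0_flat + t * v1_flat).reshape(v0.shape)
    # Standard SLERP weights
    s0 = np.sin((1.0 - t) * theta) / np.sin(theta)
    s1 = np.sin(t * theta) / np.sin(theta)
    return (s0 * v0_flat + s1 * v1_flat).reshape(v0.shape)
```

In a full model merge, this interpolation would be applied tensor-by-tensor across both checkpoints, often with different `t` values per layer group.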
