Overview
MatthieuJ/ING_Triomphant_M2_SLERP is a 7-billion-parameter language model developed by MatthieuJ. It was produced by merging two models, arcee-ai/Clown-DPO-Extended and MatthieuJ/ING_Triomphant_M1_SLERP, using SLERP (Spherical Linear Interpolation), a technique that interpolates model weights along the arc between them rather than along a straight line, yielding a blended performance profile.
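To make the merge operation concrete, here is a minimal NumPy sketch of SLERP applied to two weight vectors. This is an illustration of the underlying math only, not the model's actual merge code; the function name and fallback threshold are choices made for this example.

```python
import numpy as np

def slerp(t, v0, v1, eps=1e-8):
    # Angle between the two weight vectors (computed on normalized copies).
    v0n = v0 / np.linalg.norm(v0)
    v1n = v1 / np.linalg.norm(v1)
    dot = np.clip(np.dot(v0n, v1n), -1.0, 1.0)
    theta = np.arccos(dot)
    # Nearly colinear vectors: fall back to plain linear interpolation.
    if abs(np.sin(theta)) < eps:
        return (1.0 - t) * v0 + t * v1
    s = np.sin(theta)
    # Interpolate along the arc between v0 and v1.
    return (np.sin((1.0 - t) * theta) / s) * v0 + (np.sin(t * theta) / s) * v1

a = np.array([1.0, 0.0])
b = np.array([0.0, 1.0])
mid = slerp(0.5, a, b)  # halfway along the arc between a and b
```

Unlike linear interpolation, the midpoint here keeps unit norm (about `[0.707, 0.707]`), which is the property that motivates SLERP for weight merging.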
Key Characteristics
- Architecture: A merged model combining arcee-ai/Clown-DPO-Extended and MatthieuJ/ING_Triomphant_M1_SLERP.
- Parameter Count: 7 billion parameters.
- Context Length: Supports a context window of 4096 tokens.
- Merging Method: mergekit with the SLERP method, applying different interpolation values (t) to the self-attention and MLP layers to fine-tune the merge outcome.
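A merge of this shape is typically expressed as a mergekit YAML configuration. The sketch below follows mergekit's standard SLERP config format, but the layer ranges, per-filter t schedules, base model choice, and dtype are illustrative assumptions; the actual values used for this model are not stated in the card.

```yaml
slices:
  - sources:
      - model: arcee-ai/Clown-DPO-Extended
        layer_range: [0, 32]   # assumed layer count for a 7B model
      - model: MatthieuJ/ING_Triomphant_M1_SLERP
        layer_range: [0, 32]
merge_method: slerp
base_model: MatthieuJ/ING_Triomphant_M1_SLERP  # assumed
parameters:
  t:
    - filter: self_attn
      value: [0, 0.5, 0.3, 0.7, 1]   # illustrative per-layer schedule
    - filter: mlp
      value: [1, 0.5, 0.7, 0.3, 0]   # illustrative per-layer schedule
    - value: 0.5                     # default t for all other tensors
dtype: bfloat16
```

The separate `filter` entries for `self_attn` and `mlp` are what allow the two layer types to receive different interpolation weights, as described above.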
Intended Use
This model is suitable for general language generation and understanding tasks, aiming to benefit from the combined capabilities of its base models. The use of distinct interpolation values for the self-attention and MLP layers suggests the merge was tuned to weight each parent model's contribution differently across layer types rather than blending them uniformly.