louisgrc/Marengoli_7B_SLERP
Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 4k · Published: Mar 24, 2024 · License: apache-2.0 · Architecture: Transformer · Open Weights · Cold

Marengoli_7B_SLERP is a 7-billion-parameter language model by louisgrc, created by merging Rivoli_7B_SLERP and Marengo_7B_SLERP via spherical linear interpolation (SLERP). The merge combines the strengths of its constituent models, offering a balanced performance profile for general text-generation tasks. It targets applications that need a compact yet capable model with a 4096-token context length.
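To illustrate the merge method, here is a minimal sketch of SLERP applied to a pair of weight tensors. This is not the exact pipeline used to produce Marengoli_7B_SLERP (the model card does not specify tooling or the interpolation factor); it simply shows the spherical interpolation formula that SLERP merges apply per-tensor, falling back to linear interpolation when the tensors are nearly parallel.

```python
import numpy as np

def slerp(t, v0, v1, eps=1e-8):
    """Spherical linear interpolation between two weight tensors.

    t is the interpolation factor in [0, 1]: 0 returns v0, 1 returns v1.
    """
    v0f = v0.ravel().astype(np.float64)
    v1f = v1.ravel().astype(np.float64)
    # Cosine of the angle between the flattened tensors.
    cos = np.dot(v0f, v1f) / (np.linalg.norm(v0f) * np.linalg.norm(v1f) + eps)
    cos = np.clip(cos, -1.0, 1.0)
    theta = np.arccos(cos)
    if theta < eps:
        # Nearly parallel tensors: SLERP degenerates to LERP.
        return (1 - t) * v0 + t * v1
    s = np.sin(theta)
    # Interpolate along the great circle between the two directions.
    return (np.sin((1 - t) * theta) / s) * v0 + (np.sin(t * theta) / s) * v1
```

Unlike a plain weighted average, SLERP follows the arc between the two weight directions, which preserves the norm when interpolating unit-length tensors; this is the usual motivation for using it when merging model checkpoints.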
