r2rss/Malachite-7b-v0
Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 4k · Published: Jan 2, 2024 · License: apache-2.0 · Architecture: Transformer · Open Weights · Cold

r2rss/Malachite-7b-v0 is a 7-billion-parameter language model created by r2rss by merging zyh3826/GML-Mistral-merged-v1 and cookinai/CatMacaroni-Slerp with the slerp merge method. Rather than using a single global interpolation ratio, the merge applies parameter-specific factors to the self-attention and MLP layers, producing a distinct blend of its constituent models' capabilities (a sketch of the slerp operation appears below). It is designed for general language tasks and inherits the strengths of its Mistral-based components.
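Slerp (spherical linear interpolation) treats each pair of corresponding weight tensors as points on a hypersphere and interpolates along the arc between them, falling back to plain linear interpolation when the tensors are nearly parallel. The Python sketch below illustrates the idea under stated assumptions: the per-layer factors `t_attn` and `t_mlp` are illustrative defaults, not the values used for this model, and the weight names follow the Mistral/Hugging Face convention (`self_attn`, `mlp`).

```python
import torch

def slerp(t: float, a: torch.Tensor, b: torch.Tensor, eps: float = 1e-8) -> torch.Tensor:
    """Spherical linear interpolation between two weight tensors."""
    a_flat, b_flat = a.flatten().float(), b.flatten().float()
    a_dir = a_flat / (a_flat.norm() + eps)
    b_dir = b_flat / (b_flat.norm() + eps)
    # Angle between the two normalized tensors.
    dot = torch.clamp(torch.dot(a_dir, b_dir), -1.0, 1.0)
    theta = torch.arccos(dot)
    if theta.abs() < 1e-4:
        # Nearly parallel: linear interpolation is numerically safer.
        return ((1 - t) * a_flat + t * b_flat).reshape(a.shape).to(a.dtype)
    sin_theta = torch.sin(theta)
    coeff_a = torch.sin((1 - t) * theta) / sin_theta
    coeff_b = torch.sin(t * theta) / sin_theta
    return (coeff_a * a_flat + coeff_b * b_flat).reshape(a.shape).to(a.dtype)

def merge_state_dicts(sd_a: dict, sd_b: dict,
                      t_attn: float = 0.5, t_mlp: float = 0.5) -> dict:
    """Merge two model state dicts, using separate interpolation factors
    for self-attention and MLP weights. The factor values here are
    illustrative assumptions, not those used for Malachite-7b-v0."""
    merged = {}
    for name, w_a in sd_a.items():
        w_b = sd_b[name]
        t = t_attn if "self_attn" in name else t_mlp if "mlp" in name else 0.5
        merged[name] = slerp(t, w_a, w_b)
    return merged
```

In practice such merges are typically driven by a mergekit-style configuration rather than hand-written loops; the sketch only shows the interpolation itself.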
