Gille/StrangeMerges_36-7B-slerp
Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Context Length: 4k · Published: Mar 8, 2024 · License: apache-2.0 · Architecture: Transformer · Open Weights

StrangeMerges_36-7B-slerp is a 7-billion-parameter language model created by Gille, produced by merging ammarali32/multi_verse_model and Gille/StrangeMerges_35-7B-slerp with the slerp (spherical linear interpolation) method. Rather than averaging weights uniformly, the merge interpolates between the two parent models with layer-wise interpolation weights, aiming to combine the strengths of both. The model is intended for general text generation, with performance derived from its merged parents.
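To illustrate the interpolation the card names, here is a minimal sketch of slerp applied to two flattened weight tensors. This is an illustrative NumPy implementation of the standard slerp formula, not the actual merge tooling or configuration used to build this model; the function name and the lerp fallback for near-parallel vectors are my own choices.

```python
import numpy as np

def slerp(t, a, b, eps=1e-8):
    """Spherical linear interpolation between vectors a and b at fraction t.

    Interpolates along the arc between the directions of a and b,
    which is how slerp-style merges blend two models' weights.
    """
    a_norm = a / (np.linalg.norm(a) + eps)
    b_norm = b / (np.linalg.norm(b) + eps)
    # Angle between the two weight vectors.
    dot = np.clip(np.dot(a_norm, b_norm), -1.0, 1.0)
    theta = np.arccos(dot)
    if theta < eps:
        # Nearly parallel vectors: fall back to plain linear interpolation.
        return (1.0 - t) * a + t * b
    sin_theta = np.sin(theta)
    return (np.sin((1.0 - t) * theta) / sin_theta) * a \
         + (np.sin(t * theta) / sin_theta) * b

# Toy example on two small "weight" vectors:
w1 = np.array([1.0, 0.0])
w2 = np.array([0.0, 1.0])
merged = slerp(0.5, w1, w2)  # midpoint along the arc between w1 and w2
```

In a real merge, a value of t is chosen per layer (the "layer-wise merging configuration" mentioned above), so some layers lean toward one parent model and some toward the other.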
