Gille/StrangeMerges_32-7B-slerp
Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Context Length: 4k · Published: Mar 6, 2024 · License: apache-2.0 · Architecture: Transformer · Open Weights

Gille/StrangeMerges_32-7B-slerp is a 7-billion-parameter language model created by Gille, produced by a spherical linear interpolation (slerp) merge of Gille/StrangeMerges_31-7B-slerp and yam-peleg/Experiment28-7B. The merge blends the two constituent models with layer-wise interpolation weights, aiming to combine their strengths and achieve balanced performance across tasks. It is intended for general-purpose text generation and understanding, and suits applications that need a compact yet capable model.
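To illustrate the core operation behind a slerp merge, here is a minimal sketch of spherical linear interpolation applied to flattened weight tensors. This is an assumption-laden illustration, not the exact recipe used for this model: the actual merge applies per-layer interpolation weights and handles many tensors, and the function name `slerp` and the fallback threshold are choices made here for clarity.

```python
import numpy as np

def slerp(t, v0, v1, eps=1e-8):
    """Spherical linear interpolation between two weight tensors.

    t=0 returns v0, t=1 returns v1; intermediate t follows the arc
    on the hypersphere rather than the straight line between them.
    (Illustrative sketch, not the exact merge recipe for this model.)
    """
    v0f = v0.flatten().astype(np.float64)
    v1f = v1.flatten().astype(np.float64)
    # Angle between the two flattened parameter vectors
    n0 = v0f / (np.linalg.norm(v0f) + eps)
    n1 = v1f / (np.linalg.norm(v1f) + eps)
    dot = np.clip(np.dot(n0, n1), -1.0, 1.0)
    # Nearly colinear vectors: fall back to plain linear interpolation
    if abs(dot) > 0.9995:
        return (1 - t) * v0 + t * v1
    theta = np.arccos(dot)
    s0 = np.sin((1 - t) * theta) / np.sin(theta)
    s1 = np.sin(t * theta) / np.sin(theta)
    return (s0 * v0f + s1 * v1f).reshape(v0.shape)

# Example: blend two toy weight matrices with equal weight (t = 0.5)
rng = np.random.default_rng(0)
a = rng.standard_normal((4, 4))
b = rng.standard_normal((4, 4))
merged = slerp(0.5, a, b)
```

In practice, merge tools apply a function like this per layer, with a schedule of `t` values so that, for example, early layers lean toward one parent model and later layers toward the other.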
