Gille/StrangeMerges_34-7B-slerp
Text generation · Concurrency cost: 1 · Model size: 7B · Quant: FP8 · Context length: 4k · Published: Mar 7, 2024 · License: apache-2.0 · Architecture: Transformer · Open weights · Cold

Gille/StrangeMerges_34-7B-slerp is a 7-billion-parameter language model created by Gille, formed by a slerp (spherical linear interpolation) merge of ContextualAI/Contextual_KTO_Mistral_PairRM and Gille/StrangeMerges_30-7B-slerp. The merge applies a specific slerp configuration across the model's 32 layers, with varying 't' interpolation parameters for the self-attention and MLP blocks, to combine the strengths of its base models. It is designed for general text generation tasks and offers a 4096-token context window.
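To make the merge method concrete, here is a minimal sketch of slerp applied to two weight vectors. This is illustrative, not the merge tooling actually used: it shows how the 't' parameter moves between the two parent models' weights along the arc between them rather than along a straight line.

```python
import numpy as np

def slerp(t, v0, v1, eps=1e-8):
    """Spherical linear interpolation between two weight vectors.

    Falls back to plain linear interpolation when the vectors are
    nearly colinear, since the spherical formula degenerates there.
    """
    v0 = np.asarray(v0, dtype=np.float64)
    v1 = np.asarray(v1, dtype=np.float64)
    # Normalize copies to measure the angle between the vectors.
    n0 = v0 / (np.linalg.norm(v0) + eps)
    n1 = v1 / (np.linalg.norm(v1) + eps)
    dot = np.clip(np.dot(n0, n1), -1.0, 1.0)
    if abs(dot) > 1.0 - 1e-5:          # nearly parallel: lerp instead
        return (1 - t) * v0 + t * v1
    theta = np.arccos(dot)             # angle between the two vectors
    sin_theta = np.sin(theta)
    s0 = np.sin((1 - t) * theta) / sin_theta
    s1 = np.sin(t * theta) / sin_theta
    return s0 * v0 + s1 * v1

# t=0 recovers the first parent's weights, t=1 the second's;
# intermediate t blends them along the arc.
a = np.array([1.0, 0.0])
b = np.array([0.0, 1.0])
mid = slerp(0.5, a, b)  # midpoint on the arc between a and b
```

In a real merge this interpolation is applied tensor-by-tensor, with a different 't' per layer group (here, per the description above, different schedules for self-attention and MLP blocks).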

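Merges of this kind are commonly declared in a mergekit-style YAML configuration. The sketch below shows the general shape only; the 't' schedules and dtype are illustrative placeholders, not the values actually used for this model.

```yaml
slices:
  - sources:
      - model: ContextualAI/Contextual_KTO_Mistral_PairRM
        layer_range: [0, 32]
      - model: Gille/StrangeMerges_30-7B-slerp
        layer_range: [0, 32]
merge_method: slerp
base_model: ContextualAI/Contextual_KTO_Mistral_PairRM
parameters:
  t:
    # Illustrative schedules: mergekit interpolates these anchor
    # values across the 32 layers for each matched tensor group.
    - filter: self_attn
      value: [0, 0.5, 0.3, 0.7, 1]
    - filter: mlp
      value: [1, 0.5, 0.7, 0.3, 0]
    - value: 0.5  # default for all remaining tensors
dtype: bfloat16
```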