Gille/StrangeMerges_35-7B-slerp
Text generation · Concurrency cost: 1 · Model size: 7B · Quant: FP8 · Context length: 4k · Published: Mar 7, 2024 · License: apache-2.0 · Architecture: Transformer · Open weights

Gille/StrangeMerges_35-7B-slerp is a 7-billion-parameter language model created by Gille through a SLERP (spherical linear interpolation) merge of StrangeMerges_34-7B-slerp and StrangeMerges_32-7B-slerp. The merge uses a layer-wise interpolation strategy to combine the strengths of its constituent models, and the resulting model achieves an average score of 74.75 on the Open LLM Leaderboard. It is intended for general language understanding and generation tasks, with solid performance across reasoning and common-sense benchmarks.
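The model card does not publish the exact merge configuration, but the core SLERP operation can be sketched: instead of linearly averaging two weight tensors, SLERP interpolates along the great-circle arc between them, which better preserves the geometry of the weights. A minimal NumPy sketch (the function name `slerp` and the fallback threshold are illustrative, not taken from the model's actual merge tooling):

```python
import numpy as np

def slerp(t, v0, v1, eps=1e-8):
    """Spherical linear interpolation between two weight tensors.

    t=0 returns v0, t=1 returns v1; intermediate t values move
    along the arc between the two flattened tensors.
    """
    v0f = v0.ravel().astype(np.float64)
    v1f = v1.ravel().astype(np.float64)
    # Angle between the two tensors, computed on normalized copies
    dot = np.dot(v0f / np.linalg.norm(v0f), v1f / np.linalg.norm(v1f))
    theta = np.arccos(np.clip(dot, -1.0, 1.0))
    if theta < eps:
        # Nearly parallel tensors: fall back to plain linear interpolation
        return (1.0 - t) * v0 + t * v1
    sin_theta = np.sin(theta)
    out = (np.sin((1.0 - t) * theta) / sin_theta) * v0f \
        + (np.sin(t * theta) / sin_theta) * v1f
    return out.reshape(v0.shape).astype(v0.dtype)
```

In a layer-wise merge such as this one, a function like the above is applied per tensor, typically with a different interpolation factor `t` for different layer groups (e.g. attention vs. MLP weights), which is what "layer-wise merging strategy" refers to.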
