Gille/StrangeMerges_24-7B-slerp
Task: Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Context Length: 4k · Published: Feb 18, 2024 · License: apache-2.0 · Architecture: Transformer · Open Weights

Gille/StrangeMerges_24-7B-slerp is a 7-billion-parameter language model created by Gille, produced by a slerp merge of StrangeMerges_21-7B-slerp and bardsai/jaskier-7b-dpo-v5.6. The merge applies layer-wise spherical linear interpolation (slerp) to the parameters of the two base models in order to combine their strengths, and the resulting model reaches an average score of 76.21 on the Open LLM Leaderboard. It is intended for general language generation tasks and shows balanced performance across benchmarks covering reasoning, common sense, and question answering.
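To make the merge method concrete, the following is a minimal sketch of spherical linear interpolation applied tensor-by-tensor to two model state dicts. The function names and the single interpolation factor are illustrative assumptions; the actual model was produced with its own per-layer interpolation schedule, which is not reproduced here.

```python
import torch

def slerp(w_a: torch.Tensor, w_b: torch.Tensor, t: float, eps: float = 1e-8) -> torch.Tensor:
    """Spherical linear interpolation between two weight tensors.

    t = 0 returns w_a, t = 1 returns w_b; intermediate values follow the
    great-circle path between the flattened parameter directions.
    """
    a = w_a.flatten().float()
    b = w_b.flatten().float()
    a_dir = a / (a.norm() + eps)
    b_dir = b / (b.norm() + eps)
    # Angle between the two parameter directions.
    dot = torch.clamp(torch.dot(a_dir, b_dir), -1.0, 1.0)
    theta = torch.acos(dot)
    if theta.abs() < eps:
        # Nearly parallel tensors: fall back to plain linear interpolation.
        merged = (1.0 - t) * a + t * b
    else:
        sin_theta = torch.sin(theta)
        merged = (torch.sin((1.0 - t) * theta) / sin_theta) * a \
               + (torch.sin(t * theta) / sin_theta) * b
    return merged.reshape(w_a.shape).to(w_a.dtype)

def merge_state_dicts(sd_a: dict, sd_b: dict, t: float = 0.5) -> dict:
    """Merge two compatible state dicts layer by layer with a single factor t.

    A real slerp merge (e.g. via mergekit) typically varies t per layer group;
    the constant t here is only for illustration.
    """
    return {name: slerp(sd_a[name], sd_b[name], t) for name in sd_a}
```

Compared with simple linear averaging, slerp interpolates along the arc between the two parameter directions, which better preserves the geometry of the endpoint weights when the base models have diverged; this is the usual rationale for choosing a slerp merge over a plain weighted average.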
