Gille/StrangeMerges_11-7B-slerp
Text generation · Concurrency cost: 1 · Model size: 7B · Quant: FP8 · Context length: 4k · Published: Jan 30, 2024 · License: apache-2.0 · Architecture: Transformer · Open weights

Gille/StrangeMerges_11-7B-slerp is a 7-billion-parameter language model created by Gille, produced by merging Gille/StrangeMerges_10-7B-slerp and mlabonne/NeuralBeagle14-7B using spherical linear interpolation (slerp). The model has a 4096-token context length and shows strong general reasoning ability, with an average score of 74.80 on the Open LLM Leaderboard. It is well suited to a broad range of general-purpose language generation and understanding tasks.
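Slerp merging interpolates between the two parent models' weights along the arc of a hypersphere rather than along a straight line, which tends to preserve the geometry of the weight space better than plain linear averaging. A minimal sketch of slerp on a pair of flattened weight tensors (this is an illustrative NumPy implementation, not the exact code used to produce this model):

```python
import math
import numpy as np

def slerp(t, v0, v1, eps=1e-8):
    """Spherically interpolate between weight vectors v0 and v1 at fraction t.

    t=0 returns v0, t=1 returns v1; intermediate t follows the great-circle
    arc between the two directions. Falls back to linear interpolation when
    the vectors are nearly parallel, where the slerp formula is unstable.
    """
    v0_dir = v0 / (np.linalg.norm(v0) + eps)
    v1_dir = v1 / (np.linalg.norm(v1) + eps)
    dot = float(np.clip(np.dot(v0_dir, v1_dir), -1.0, 1.0))
    theta = math.acos(dot)  # angle between the two weight directions
    if abs(theta) < eps:    # nearly parallel: lerp is a safe approximation
        return (1 - t) * v0 + t * v1
    s0 = math.sin((1 - t) * theta) / math.sin(theta)
    s1 = math.sin(t * theta) / math.sin(theta)
    return s0 * v0 + s1 * v1
```

In a real merge this interpolation is applied per tensor (often with different `t` values for attention and MLP layers); tooling such as mergekit handles that orchestration.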
