Gille/StrangeMerges_20-7B-slerp
Text generation · Concurrency cost: 1 · Model size: 7B · Quant: FP8 · Context length: 4K · Published: Feb 2, 2024 · License: apache-2.0 · Architecture: Transformer

Gille/StrangeMerges_20-7B-slerp is a 7-billion-parameter language model created by Gille by merging flemmingmiguel/MBX-7B-v3 and Gille/StrangeMerges_11-7B-slerp with SLERP (spherical linear interpolation). It scores an average of 75.52 on the Open LLM Leaderboard, indicating strong general reasoning and language understanding across the leaderboard's benchmarks. With a 4,096-token context window, it is suitable for a wide range of general-purpose natural language processing tasks.
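The "slerp" in the name refers to spherical linear interpolation, the method used to blend the parent models' weights. As a rough illustration of what that means, below is a minimal PyTorch sketch of SLERP applied to two weight tensors. The actual merge would have been produced with dedicated tooling (e.g. mergekit) and per-layer interpolation settings that are not documented here, so treat this as a conceptual sketch rather than the exact recipe.

```python
import torch

def slerp(t: float, v0: torch.Tensor, v1: torch.Tensor, eps: float = 1e-8) -> torch.Tensor:
    """Spherical linear interpolation between two weight tensors."""
    # Normalize flattened copies to measure the angle between the tensors.
    a = v0.flatten() / (v0.norm() + eps)
    b = v1.flatten() / (v1.norm() + eps)
    dot = torch.clamp(torch.dot(a, b), -1.0, 1.0)
    theta = torch.acos(dot)

    # Nearly colinear tensors: plain linear interpolation is numerically safer.
    if theta.abs() < 1e-4:
        return (1.0 - t) * v0 + t * v1

    # Weight each parent so the interpolation follows the arc between them.
    sin_theta = torch.sin(theta)
    w0 = torch.sin((1.0 - t) * theta) / sin_theta
    w1 = torch.sin(t * theta) / sin_theta
    return w0 * v0 + w1 * v1

# Blend two (dummy) parameter tensors halfway between the parents.
merged = slerp(0.5, torch.randn(4096, 4096), torch.randn(4096, 4096))
```

Since this is a standard 7B Transformer checkpoint, it can presumably be loaded with the usual Hugging Face transformers pattern; the snippet below is a generic sketch, with a placeholder prompt and generation settings not taken from the model card.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Gille/StrangeMerges_20-7B-slerp"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype="auto", device_map="auto"
)

# Placeholder prompt; inputs must fit in the 4,096-token context window.
inputs = tokenizer(
    "Explain spherical linear interpolation briefly.", return_tensors="pt"
).to(model.device)
outputs = model.generate(**inputs, max_new_tokens=100)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```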
