Gille/StrangeMerges_8-7B-slerp
Text Generation · Open Weights
Model Size: 7B | Quant: FP8 | Ctx Length: 4k | Architecture: Transformer
Published: Jan 28, 2024 | License: apache-2.0 | Concurrency Cost: 1
Gille/StrangeMerges_8-7B-slerp is a 7 billion parameter language model created by Gille, produced by merging Gille/StrangeMerges_7-7B-slerp and Gille/StrangeMerges_5-7B-ties with the slerp (spherical linear interpolation) method. The model achieves an average score of 73.39 on the Open LLM Leaderboard, with notable results on reasoning and common-sense benchmarks. It is suitable for general text generation tasks that need a balance of quality and efficiency within a 4096-token context window.
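A slerp merge interpolates each pair of corresponding weight tensors along the great-circle arc between them rather than mixing them linearly. The following is a minimal sketch of that interpolation in NumPy; the function name and the flat-vector treatment are illustrative assumptions, not the actual merge tooling used to build this model.

```python
import numpy as np

def slerp(t, v0, v1, eps=1e-8):
    """Spherical linear interpolation between two weight vectors.

    t=0 returns v0, t=1 returns v1; intermediate t values follow
    the arc between the two directions, which tends to preserve
    weight geometry better than plain linear averaging.
    """
    v0 = np.asarray(v0, dtype=np.float64)
    v1 = np.asarray(v1, dtype=np.float64)
    # Angle between the (normalized) vectors.
    n0 = v0 / np.linalg.norm(v0)
    n1 = v1 / np.linalg.norm(v1)
    dot = np.clip(np.dot(n0, n1), -1.0, 1.0)
    # Nearly parallel vectors: fall back to linear interpolation.
    if 1.0 - abs(dot) < eps:
        return (1.0 - t) * v0 + t * v1
    omega = np.arccos(dot)
    so = np.sin(omega)
    return np.sin((1.0 - t) * omega) / so * v0 + np.sin(t * omega) / so * v1
```

In an actual model merge this interpolation is applied tensor-by-tensor across the two parent checkpoints, often with a different `t` per layer group.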