Gille/StrangeMerges_16-7B-slerp
Text Generation · Model Size: 7B · Quant: FP8 · Ctx Length: 4k · Published: Jan 31, 2024 · License: apache-2.0 · Architecture: Transformer

Gille/StrangeMerges_16-7B-slerp is a 7-billion-parameter language model created by Gille by merging Gille/StrangeMerges_15-7B-slerp and SanjiWatsuki/Kunoichi-7B with the SLERP (spherical linear interpolation) method. The model has a 4096-token context length and achieves an average score of 72.80 on the Open LLM Leaderboard, indicating strong general reasoning and language understanding. It is suitable for a variety of general-purpose natural language processing tasks.
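The exact merge configuration is not published here, but the core idea of SLERP merging can be sketched in a few lines: instead of linearly averaging two models' weight tensors, each pair of tensors is interpolated along the great-circle arc between them, which preserves their magnitude better than a straight linear blend. The sketch below is a minimal, illustrative implementation with NumPy; the function name `slerp`, the flattening of tensors to vectors, and the single global interpolation factor `t` are simplifying assumptions (real merge tooling typically works tensor-by-tensor and may vary `t` per layer).

```python
import numpy as np

def slerp(w_a: np.ndarray, w_b: np.ndarray, t: float) -> np.ndarray:
    """Spherical linear interpolation between two weight tensors.

    Illustrative sketch only: flattens each tensor to a vector and
    interpolates along the arc between their directions.
    """
    a = w_a.astype(np.float64).ravel()
    b = w_b.astype(np.float64).ravel()
    # Angle between the two weight vectors.
    a_unit = a / np.linalg.norm(a)
    b_unit = b / np.linalg.norm(b)
    dot = np.clip(np.dot(a_unit, b_unit), -1.0, 1.0)
    theta = np.arccos(dot)
    if np.isclose(theta, 0.0):
        # Nearly colinear vectors: fall back to linear interpolation.
        out = (1.0 - t) * a + t * b
    else:
        s = np.sin(theta)
        out = (np.sin((1.0 - t) * theta) / s) * a + (np.sin(t * theta) / s) * b
    return out.reshape(w_a.shape)
```

In a full model merge, this interpolation would be applied to each corresponding parameter tensor of the two source checkpoints, producing the merged model's weights.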
