Gille/StrangeMerges_23-7B-slerp
Text Generation · Model Size: 7B · Quant: FP8 · Ctx Length: 4k · Published: Feb 13, 2024 · License: apache-2.0 · Architecture: Transformer · Open Weights

StrangeMerges_23-7B-slerp is a 7-billion-parameter language model created by Gille, produced by merging paulml/OGNO-7B and Gille/StrangeMerges_21-7B-slerp using the slerp (spherical linear interpolation) method. The model supports a 4,096-token context window and achieves an average score of 76.17 on the Open LLM Leaderboard, indicating strong performance across reasoning and language-understanding benchmarks. It is suitable for general-purpose text generation and conversational AI applications.
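Slerp merges of this kind are typically produced with the mergekit tool. The sketch below shows what a mergekit slerp configuration for these two source models might look like; the `layer_range`, interpolation weights (`t`), and `dtype` values are illustrative assumptions, not the actual settings used for this model.

```yaml
# Hypothetical mergekit config — values are assumptions, not the published recipe.
slices:
  - sources:
      - model: paulml/OGNO-7B
        layer_range: [0, 32]          # assumed: full 32-layer 7B stack
      - model: Gille/StrangeMerges_21-7B-slerp
        layer_range: [0, 32]
merge_method: slerp
base_model: paulml/OGNO-7B            # assumed choice of base
parameters:
  t:                                  # interpolation factor, 0 = base, 1 = other model
    - filter: self_attn
      value: [0, 0.5, 0.3, 0.7, 1]    # illustrative per-layer-group weights
    - filter: mlp
      value: [1, 0.5, 0.7, 0.3, 0]
    - value: 0.5                      # default for remaining tensors
dtype: bfloat16
```

Running `mergekit-yaml config.yml ./output-model` with a configuration like this would emit the merged checkpoint; slerp interpolates each weight tensor along the great circle between the two source tensors rather than averaging them linearly.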
