Gille/StrangeMerges_12-7B-slerp
TEXT GENERATION
Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 4k · Published: Jan 30, 2024 · License: apache-2.0 · Architecture: Transformer · Open Weights · Cold
Gille/StrangeMerges_12-7B-slerp is a 7-billion-parameter language model created by Gille, built by merging Keynote-Technology/KAI-7B-v0.1 and Gille/StrangeMerges_11-7B-slerp with the slerp (spherical linear interpolation) merge method. The model achieves an average score of 69.13 on the Open LLM Leaderboard, indicating solid performance across a range of reasoning and language-understanding tasks. With a 4096-token context length, it is suitable for general-purpose text generation and conversational AI applications.
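To illustrate what the slerp merge method does, here is a minimal sketch of spherical linear interpolation applied to a pair of weight vectors. The function name and the element-wise treatment are illustrative assumptions, not the actual merge tooling's implementation, but the formula is the standard slerp interpolation used when blending two parent models' weights.

```python
import math

def slerp(t, v0, v1, eps=1e-8):
    """Spherically interpolate between weight vectors v0 and v1 at fraction t.

    Unlike plain linear interpolation, slerp follows the arc between the two
    vectors, preserving their geometric relationship as t moves from 0 to 1.
    """
    norm0 = math.sqrt(sum(x * x for x in v0))
    norm1 = math.sqrt(sum(x * x for x in v1))
    dot = sum(a * b for a, b in zip(v0, v1)) / (norm0 * norm1)
    dot = max(-1.0, min(1.0, dot))           # clamp for numerical safety
    omega = math.acos(dot)                   # angle between the two vectors
    if abs(math.sin(omega)) < eps:           # nearly parallel: fall back to lerp
        return [(1 - t) * a + t * b for a, b in zip(v0, v1)]
    s0 = math.sin((1 - t) * omega) / math.sin(omega)
    s1 = math.sin(t * omega) / math.sin(omega)
    return [s0 * a + s1 * b for a, b in zip(v0, v1)]

# At t=0 the result equals the first parent's weights; at t=1, the second's.
print(slerp(0.0, [1.0, 0.0], [0.0, 1.0]))  # → [1.0, 0.0]
print(slerp(0.5, [1.0, 0.0], [0.0, 1.0]))  # halfway along the arc
```

In an actual merge, an interpolation factor like this is applied tensor by tensor across the two parent checkpoints, which is why the merged model can inherit behavior from both parents.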