Gille/StrangeMerges_15-7B-slerp
Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 4k · Published: Jan 31, 2024 · License: apache-2.0 · Architecture: Transformer

Gille/StrangeMerges_15-7B-slerp is a 7-billion-parameter language model created by Gille by merging Gille/StrangeMerges_14-7B-slerp and CultriX/Wernicke-7B-v9 via spherical linear interpolation (SLERP). The model targets general text generation, with the merge intended to balance the strengths of its two parents. It has a 4096-token context length and scores an average of 72.41 on the Open LLM Leaderboard, indicating solid reasoning and language-understanding capabilities.
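The model card does not publish the merge recipe, but the SLERP operation itself is simple: each pair of corresponding weight tensors is interpolated along the great circle between them rather than linearly. A minimal NumPy sketch (the function name and the linear-interpolation fallback for near-parallel tensors are illustrative assumptions, not the exact procedure used for this merge):

```python
import numpy as np

def slerp(t, v0, v1, eps=1e-8):
    """Spherical linear interpolation between two weight tensors.

    t=0 returns v0, t=1 returns v1; intermediate t follows the arc
    between the two (normalized) directions.
    """
    v0f = v0.astype(np.float64).ravel()
    v1f = v1.astype(np.float64).ravel()
    # Angle between the two tensors, treated as flat vectors.
    dot = np.clip(
        np.dot(v0f / np.linalg.norm(v0f), v1f / np.linalg.norm(v1f)),
        -1.0, 1.0,
    )
    theta = np.arccos(dot)
    if theta < eps:
        # Nearly parallel: fall back to plain linear interpolation.
        return ((1 - t) * v0f + t * v1f).reshape(v0.shape)
    s0 = np.sin((1 - t) * theta) / np.sin(theta)
    s1 = np.sin(t * theta) / np.sin(theta)
    return (s0 * v0f + s1 * v1f).reshape(v0.shape)
```

In practice, tools such as mergekit apply this tensor-by-tensor across the two parent checkpoints, often with a different interpolation weight per layer.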
