damerajee/Oot-v2_lll
Task: Text Generation
Concurrency Cost: 1
Model Size: 7B
Quantization: FP8
Context Length: 4k
Published: Jan 11, 2024
License: apache-2.0
Architecture: Transformer (open weights)

damerajee/Oot-v2_lll is a 7-billion-parameter language model created by damerajee, produced by merging mlabonne/Marcoro14-7B-slerp and Weyaxi/OpenHermes-2.5-neural-chat-v3-3-Slerp with the SLERP (spherical linear interpolation) merge method. The model targets general language understanding and generation, combining the strengths of its two parent models. It supports a 4096-token context length and achieves an average score of 72.73 on the Open LLM Leaderboard, indicating strong performance across benchmarks covering reasoning, commonsense inference, and MMLU.
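A SLERP merge interpolates each pair of corresponding weight tensors along the arc between them rather than along a straight line, which better preserves the geometry of the parent weights than plain averaging. Below is a minimal NumPy sketch of the per-tensor operation; the function name, epsilon threshold, and the t = 0.5 blend are illustrative assumptions, and merge tools such as mergekit apply this per layer with configurable interpolation factors.

```python
import numpy as np

def slerp(t: float, v0: np.ndarray, v1: np.ndarray, eps: float = 1e-8) -> np.ndarray:
    """Spherical linear interpolation between two weight tensors of the same shape."""
    # Measure the angle between the two tensors using flattened unit vectors.
    u0 = v0.ravel() / (np.linalg.norm(v0) + eps)
    u1 = v1.ravel() / (np.linalg.norm(v1) + eps)
    dot = float(np.clip(np.dot(u0, u1), -1.0, 1.0))
    theta = np.arccos(dot)
    # Nearly parallel tensors: fall back to ordinary linear interpolation.
    if theta < eps:
        return (1.0 - t) * v0 + t * v1
    s = np.sin(theta)
    # Weight each parent by how far along the arc the blend point sits.
    return (np.sin((1.0 - t) * theta) / s) * v0 + (np.sin(t * theta) / s) * v1

# Example: blend two random stand-in "layers" halfway between the parents.
a = np.random.randn(4, 4).astype(np.float32)
b = np.random.randn(4, 4).astype(np.float32)
merged = slerp(0.5, a, b)
```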

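If the checkpoint is hosted on the Hugging Face Hub under the same id, loading should follow the standard transformers pattern. This is a sketch, assuming transformers (and accelerate, for device_map="auto") are installed; the prompt text and generation settings are illustrative.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "damerajee/Oot-v2_lll"  # assumed Hub id matching this listing
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",   # pick the dtype stored in the checkpoint
    device_map="auto",    # requires accelerate; places layers on available devices
)

prompt = "Explain spherical linear interpolation in one sentence."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```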