liminerity/Neurotic-Jomainotrik-7b-slerp
Text Generation
Concurrency Cost: 1
Model Size: 7B
Quant: FP8
Ctx Length: 4k
Published: Feb 25, 2024
License: apache-2.0
Architecture: Transformer
Open Weights
liminerity/Neurotic-Jomainotrik-7b-slerp is a 7 billion parameter language model created by liminerity by merging liminerity/merge and bardsai/jaskier-7b-dpo-v5.6 with SLERP (spherical linear interpolation). The model achieves an average score of 76.40 on the Open LLM Leaderboard, indicating strong general reasoning ability. It is well suited to tasks that require robust understanding and generation, with notable performance in common-sense reasoning and question answering.
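To illustrate the merge method, the sketch below shows how SLERP interpolates between two weight tensors. This is a minimal, illustrative implementation of the general technique (the actual merge was presumably produced with a tool such as mergekit, whose internals may differ); the function name and the fallback threshold are assumptions, not taken from the model card.

```python
import numpy as np

def slerp(t, v0, v1, eps=1e-8):
    """Spherical linear interpolation between two weight tensors.

    t:  interpolation factor in [0, 1] (0 -> v0, 1 -> v1)
    v0, v1: numpy arrays of identical shape (e.g. one layer's weights)
    """
    v0f = v0.flatten().astype(np.float64)
    v1f = v1.flatten().astype(np.float64)
    # Cosine of the angle between the two flattened tensors
    cos_theta = np.dot(v0f, v1f) / (
        np.linalg.norm(v0f) * np.linalg.norm(v1f) + eps
    )
    cos_theta = np.clip(cos_theta, -1.0, 1.0)
    theta = np.arccos(cos_theta)
    # Nearly parallel tensors: fall back to plain linear interpolation
    if abs(np.sin(theta)) < eps:
        return ((1.0 - t) * v0f + t * v1f).reshape(v0.shape)
    # SLERP weights: interpolate along the great arc between the tensors
    s0 = np.sin((1.0 - t) * theta) / np.sin(theta)
    s1 = np.sin(t * theta) / np.sin(theta)
    return (s0 * v0f + s1 * v1f).reshape(v0.shape)
```

A merge tool applies a function like this per tensor, pairing corresponding layers of the two source models and choosing `t` (possibly per layer) to control how much each parent contributes.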