Kukedlc/Neural-Cosmic-Boy-7B-slerp
Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 4k · Published: Feb 17, 2024 · License: apache-2.0 · Architecture: Transformer
Kukedlc/Neural-Cosmic-Boy-7B-slerp is a 7-billion-parameter language model developed by Kukedlc, built on the Mistral-7B-v0.1 base architecture. It is a merge of Neural-Cosmic-7B-slerp, NeuralLogic-7B-V, and SuperCombo, combined using the TIES merging method. The model demonstrates strong general reasoning, achieving an average score of 74.08 on the Open LLM Leaderboard, including 70.48 on the AI2 Reasoning Challenge (ARC) and 64.92 on MMLU.
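A TIES merge like the one described is typically expressed as a mergekit configuration. The sketch below is illustrative only: the exact repository paths, density/weight values, and dtype are assumptions, not the author's published recipe; only the model names, base model, and merge method come from the description above.

```yaml
# Hypothetical mergekit config sketch for a TIES merge of the three
# source models named in the card. Parameter values are placeholders.
models:
  - model: Kukedlc/Neural-Cosmic-7B-slerp   # name from the card; path assumed
    parameters:
      density: 0.5    # fraction of parameters retained (assumed)
      weight: 0.4     # contribution to the merged model (assumed)
  - model: Kukedlc/NeuralLogic-7B-V          # name from the card; path assumed
    parameters:
      density: 0.5
      weight: 0.3
  - model: Kukedlc/SuperCombo                # name from the card; path assumed
    parameters:
      density: 0.5
      weight: 0.3
merge_method: ties
base_model: mistralai/Mistral-7B-v0.1
parameters:
  normalize: true
dtype: bfloat16
```

With a config like this, mergekit resolves sign conflicts between the task vectors of the three models relative to the Mistral-7B-v0.1 base, which is what distinguishes TIES from a plain weighted average.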