liminerity/binarized-ingotrix-slerp-7b
Text Generation | Concurrency Cost: 1 | Model Size: 7B | Quant: FP8 | Ctx Length: 4K | Published: Feb 12, 2024 | License: apache-2.0 | Architecture: Transformer | Open Weights
liminerity/binarized-ingotrix-slerp-7b is a 7-billion-parameter language model created by liminerity by merging eren23/dpo-binarized-NeuralTrix-7B with liminerity/Ingot-7b-slerp-7-forged using spherical linear interpolation (SLERP). The model shows strong general reasoning ability, with an average score of 76.04 on the Open LLM Leaderboard, and is well suited to tasks that draw on commonsense reasoning and language understanding.
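The upstream merge recipe is not reproduced here, but the core operation behind a SLERP merge is simple: each pair of matching weight tensors from the two parent checkpoints is combined by interpolating along the great-circle arc between them rather than linearly. The sketch below is illustrative only, assuming both checkpoints share identical tensor shapes; the slerp helper and the single global interpolation factor t = 0.5 are assumptions, not the published recipe.

import torch

def slerp(t: float, a: torch.Tensor, b: torch.Tensor, eps: float = 1e-8) -> torch.Tensor:
    """Spherical linear interpolation between two weight tensors.

    Flattens both tensors, treats them as points on a hypersphere,
    and interpolates along the arc between them. Falls back to plain
    linear interpolation when the tensors are nearly colinear.
    """
    a_flat, b_flat = a.flatten().float(), b.flatten().float()
    a_dir = a_flat / (a_flat.norm() + eps)
    b_dir = b_flat / (b_flat.norm() + eps)
    # Angle between the two normalized parameter vectors
    omega = torch.arccos(torch.clamp(a_dir @ b_dir, -1.0, 1.0))
    if omega.abs() < 1e-4:
        # Nearly colinear: lerp is numerically safer than dividing by sin(omega)
        return ((1 - t) * a + t * b).to(a.dtype)
    sin_omega = torch.sin(omega)
    merged = (torch.sin((1 - t) * omega) / sin_omega) * a_flat \
           + (torch.sin(t * omega) / sin_omega) * b_flat
    return merged.reshape(a.shape).to(a.dtype)

if __name__ == "__main__":
    # Toy demonstration with random tensors standing in for two checkpoints
    a = torch.randn(1024, 1024)
    b = torch.randn(1024, 1024)
    merged = slerp(0.5, a, b)  # midpoint on the arc between the two tensors
    print(merged.shape)

In practice, merges like this one are usually produced with a tool such as mergekit, which applies a schedule of t values that varies per layer and per module (e.g. attention vs. MLP) rather than a single global factor.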