MSL7/INEX4-7b
Text Generation · Model Size: 7B · Quant: FP8 · Context Length: 4k · Concurrency Cost: 1 · Published: Mar 2, 2024 · License: apache-2.0 · Architecture: Transformer

MSL7/INEX4-7b is a 7-billion-parameter language model developed by Liminerity, created through a series of SLERP (spherical linear interpolation) merges using MergeKit. It combines components from liminerity/Ingot-7b-slerp-7-forged and yam-peleg/Experiment26-7B and supports a 4096-token context length. It achieves an average score of 75.84 on the Open LLM Leaderboard, indicating strong general reasoning and making it suitable for diverse natural language processing tasks.
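For context on the merge method: MergeKit's slerp mode interpolates each pair of corresponding weight tensors along the arc between them rather than along a straight line, which tends to preserve the scale of the weights better than plain averaging. A minimal sketch of the underlying interpolation (illustrative only; MergeKit's actual implementation operates on full model checkpoints and supports per-layer interpolation schedules):

```python
import math

def slerp(t, v0, v1, eps=1e-8):
    """Spherical linear interpolation between two weight vectors.

    t=0 returns v0, t=1 returns v1; intermediate t follows the arc
    between the two vectors instead of the chord.
    """
    n0 = math.sqrt(sum(x * x for x in v0))
    n1 = math.sqrt(sum(x * x for x in v1))
    # Cosine of the angle between the vectors, clamped for safety.
    dot = max(-1.0, min(1.0, sum(a * b for a, b in zip(v0, v1)) / (n0 * n1)))
    omega = math.acos(dot)
    if abs(math.sin(omega)) < eps:
        # Nearly parallel vectors: fall back to ordinary linear interpolation.
        return [(1 - t) * a + t * b for a, b in zip(v0, v1)]
    s0 = math.sin((1 - t) * omega) / math.sin(omega)
    s1 = math.sin(t * omega) / math.sin(omega)
    return [s0 * a + s1 * b for a, b in zip(v0, v1)]
```

For example, `slerp(0.5, [1.0, 0.0], [0.0, 1.0])` lands on the unit circle midway between the two inputs, whereas a plain average would shrink the result toward the origin.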
