MSL7/INEX12-7b
Task: text generation · Model size: 7B · Quantization: FP8 · Context length: 4k · Published: Mar 3, 2024 · License: apache-2.0 · Architecture: Transformer

INEX12-7b is a 7-billion-parameter language model from MSL7, produced by merging liminerity/merge2 and yam-peleg/Experiment26-7B with mergekit. The merge uses the slerp (spherical linear interpolation) method to combine the weights of the two constituent models, aiming for a balanced general-purpose language model that inherits capabilities from both.
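To illustrate the slerp idea behind this kind of merge, here is a minimal sketch of spherical linear interpolation between two weight vectors. This is not mergekit's actual implementation (mergekit applies interpolation tensor-by-tensor with per-layer schedules); it is just the underlying formula, with a lerp fallback for near-colinear vectors:

```python
import math

def slerp(t, v0, v1, eps=1e-8):
    """Spherical linear interpolation between vectors v0 and v1 at fraction t.

    Interpolates along the arc between the two vectors rather than the
    straight line, which better preserves vector norms when merging weights.
    """
    n0 = math.sqrt(sum(a * a for a in v0))
    n1 = math.sqrt(sum(b * b for b in v1))
    # Angle between the vectors, with the cosine clamped for safety.
    cos_omega = max(-1.0, min(1.0, sum(a * b for a, b in zip(v0, v1)) / (n0 * n1)))
    omega = math.acos(cos_omega)
    if math.sin(omega) < eps:
        # Vectors are nearly colinear: fall back to plain linear interpolation.
        return [(1 - t) * a + t * b for a, b in zip(v0, v1)]
    c0 = math.sin((1 - t) * omega) / math.sin(omega)
    c1 = math.sin(t * omega) / math.sin(omega)
    return [c0 * a + c1 * b for a, b in zip(v0, v1)]

# Midpoint of two orthogonal unit vectors stays on the unit sphere.
mid = slerp(0.5, [1.0, 0.0], [0.0, 1.0])
```

Unlike a plain average, the slerp midpoint of two orthogonal unit vectors keeps unit norm, which is one reason slerp is a popular choice for weight merges.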
