MSL7/INEX16-7b
TEXT GENERATION
Model size: 7B · Quantization: FP8 · Context length: 4k · Published: Mar 11, 2024 · License: apache-2.0 · Architecture: Transformer · Open weights

MSL7/INEX16-7b is a 7 billion parameter language model created by MSL7, produced by merging MSL7/INEX12-7b and liminerity/i with the SLERP (spherical linear interpolation) method. The merge combines the strengths of its constituent models into a single general-purpose model for language understanding and generation, aimed at robust performance across a range of natural language processing tasks, with a context length of 4096 tokens.
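The SLERP merge mentioned above can be sketched as follows. This is an illustrative NumPy implementation of spherical linear interpolation between two weight tensors, not the actual merge tooling used to produce this model; the function name, tensors, and interpolation factor are assumptions for the example.

```python
import numpy as np

def slerp(a, b, t, eps=1e-8):
    """Spherically interpolate between weight tensors a and b at factor t in [0, 1]."""
    a_flat, b_flat = a.ravel(), b.ravel()
    # Angle between the flattened weight vectors
    cos_theta = np.dot(a_flat, b_flat) / (
        np.linalg.norm(a_flat) * np.linalg.norm(b_flat) + eps
    )
    theta = np.arccos(np.clip(cos_theta, -1.0, 1.0))
    if theta < eps:
        # Nearly parallel vectors: fall back to plain linear interpolation
        return (1 - t) * a + t * b
    s = np.sin(theta)
    # Weights sum along the great-circle arc between the two tensors
    return (np.sin((1 - t) * theta) / s) * a + (np.sin(t * theta) / s) * b

# Toy example: interpolate halfway between two orthogonal weight vectors
w1 = np.array([1.0, 0.0])
w2 = np.array([0.0, 1.0])
merged = slerp(w1, w2, 0.5)
```

In practice a merge like this is applied tensor-by-tensor across the two source checkpoints, with `t` controlling how much each parent contributes; unlike plain averaging, SLERP preserves the norm-direction geometry of the weights along the interpolation path.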
