MSL7/INEX8-7B
Task: Text generation
Model size: 7B
Quantization: FP8
Context length: 4k
Published: Mar 2, 2024
License: apache-2.0
Architecture: Transformer
Concurrency cost: 1

INEX8-7B is a 7-billion-parameter language model developed by liminerity, created through a series of slerp (spherical linear interpolation) merges of several 7B models, including MSL7/INEX4-7b and yam-peleg/Experiment26-7B. The model is designed for general-purpose language tasks, leveraging its merged weights to achieve balanced performance across benchmarks. With a 4096-token context length, it offers a solid foundation for applications requiring robust language understanding and generation.
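The slerp merging mentioned above interpolates between two models' weights along the arc of a sphere rather than a straight line, which tends to preserve the geometry of the weight space better than plain averaging. Below is a minimal, hedged sketch of the slerp operation on flat weight vectors; actual merge tooling applies this per tensor across two full checkpoints, and the function name and fallback threshold here are illustrative choices, not the exact implementation used to build INEX8-7B.

```python
import math

def slerp(v0, v1, t):
    """Spherical linear interpolation between two weight vectors.

    t=0 returns v0, t=1 returns v1; intermediate t values blend
    along the great-circle arc between the two directions.
    """
    # Angle between the two vectors, clamped for numerical safety.
    dot = sum(a * b for a, b in zip(v0, v1))
    n0 = math.sqrt(sum(a * a for a in v0))
    n1 = math.sqrt(sum(b * b for b in v1))
    cos_theta = max(-1.0, min(1.0, dot / (n0 * n1)))
    theta = math.acos(cos_theta)

    if theta < 1e-6:
        # Nearly parallel vectors: fall back to linear interpolation,
        # since sin(theta) in the slerp weights would divide by ~0.
        return [(1 - t) * a + t * b for a, b in zip(v0, v1)]

    # Standard slerp coefficients.
    s0 = math.sin((1 - t) * theta) / math.sin(theta)
    s1 = math.sin(t * theta) / math.sin(theta)
    return [s0 * a + s1 * b for a, b in zip(v0, v1)]
```

For example, slerping two orthogonal unit vectors at t=0.5 yields a unit vector at 45 degrees to both, whereas plain averaging would shrink its norm to about 0.707.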
