InnerI/InnerI-AI-sn6-7B-slerp
TEXT GENERATION · Concurrency Cost: 1 · Model Size: 8B · Quant: FP8 · Ctx Length: 8k · Published: Feb 17, 2024 · License: llama2 · Architecture: Transformer · Open Weights · Cold

InnerI/InnerI-AI-sn6-7B-slerp is an 8 billion parameter language model created by InnerI, formed by merging tomaszki/nous-thirty and InnerI/A-I-0xtom-7B-slerp with the SLERP (spherical linear interpolation) merge method. SLERP interpolates the two base models' parameters layer by layer along the great-circle path between them rather than averaging linearly, which helps preserve the geometry of each model's weights and combine their strengths, yielding balanced performance across general language tasks. With an 8192-token context length, it is suitable for applications requiring robust conversational abilities and text generation.
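As a minimal sketch of the idea (not the actual merge pipeline used for this model), SLERP between two weight tensors can be written as follows; the function name and the fallback behavior for near-parallel tensors are illustrative assumptions:

```python
import numpy as np

def slerp(t, v0, v1, eps=1e-8):
    """Spherical linear interpolation between two weight tensors.

    t is the mixing factor in [0, 1]: t=0 returns v0, t=1 returns v1.
    The tensors are flattened, interpolated along the arc between their
    directions, and reshaped back to the original shape.
    """
    v0f = v0.ravel().astype(np.float64)
    v1f = v1.ravel().astype(np.float64)
    # Normalize to unit vectors to measure the angle between the tensors.
    n0 = v0f / (np.linalg.norm(v0f) + eps)
    n1 = v1f / (np.linalg.norm(v1f) + eps)
    dot = np.clip(np.dot(n0, n1), -1.0, 1.0)
    theta = np.arccos(dot)
    if theta < eps:
        # Nearly parallel tensors: fall back to plain linear interpolation.
        return ((1.0 - t) * v0f + t * v1f).reshape(v0.shape)
    sin_theta = np.sin(theta)
    s0 = np.sin((1.0 - t) * theta) / sin_theta
    s1 = np.sin(t * theta) / sin_theta
    return (s0 * v0f + s1 * v1f).reshape(v0.shape)
```

In a real merge, this interpolation would be applied per layer (often with a different `t` per layer or parameter group), which is what "layer-wise parameter interpolation" refers to.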
