Praneeth/StarMix-7B-slerp
- Task: Text Generation
- Model Size: 7B
- Quantization: FP8
- Context Length: 4K
- Concurrency Cost: 1
- Published: Jan 11, 2024
- License: apache-2.0
- Architecture: Transformer
- Weights: Open

Praneeth/StarMix-7B-slerp is a 7-billion-parameter language model created by Praneeth, produced by merging Starling-LM-7B-alpha and Mistral-7B-Instruct-v0.2 with slerp (spherical linear interpolation). The merge aims to combine the strengths of its two parent models for stronger benchmark performance, and the model is suited to general-purpose conversational AI and instruction-following tasks within its 4096-token context window.
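
In a slerp merge, corresponding weight tensors from the two parent models are interpolated along the arc between them rather than along a straight line, which preserves the magnitude of the blended weights better than plain averaging. The sketch below shows the per-tensor operation under assumed details: the interpolation factor `t` and any per-layer schedule are not published here, and the helper name `slerp` is illustrative rather than taken from the actual merge recipe.

```python
import torch

def slerp(t: float, a: torch.Tensor, b: torch.Tensor, eps: float = 1e-8) -> torch.Tensor:
    """Spherical linear interpolation between two weight tensors.

    t=0 returns `a` (one parent's weights), t=1 returns `b` (the other's).
    """
    a_flat, b_flat = a.flatten().float(), b.flatten().float()
    # Measure the angle between the two tensors viewed as vectors.
    a_unit = a_flat / (a_flat.norm() + eps)
    b_unit = b_flat / (b_flat.norm() + eps)
    dot = torch.clamp(a_unit @ b_unit, -1.0, 1.0)
    theta = torch.arccos(dot)

    # Nearly parallel tensors: fall back to plain linear interpolation.
    if theta < 1e-4:
        merged = (1.0 - t) * a_flat + t * b_flat
    else:
        sin_theta = torch.sin(theta)
        merged = (torch.sin((1.0 - t) * theta) / sin_theta) * a_flat \
               + (torch.sin(t * theta) / sin_theta) * b_flat
    return merged.reshape(a.shape).to(a.dtype)
```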
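
For inference, a minimal sketch with the Hugging Face transformers library, assuming the checkpoint is available under the repo id Praneeth/StarMix-7B-slerp and ships a chat template (the prompt text and generation settings below are illustrative):

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Praneeth/StarMix-7B-slerp"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",   # use the checkpoint's stored dtype
    device_map="auto",    # requires `accelerate`; spreads layers over available devices
)

# Format an instruction-style prompt with the tokenizer's chat template.
messages = [{"role": "user", "content": "Summarize what a slerp model merge does."}]
input_ids = tokenizer.apply_chat_template(messages, return_tensors="pt").to(model.device)

# Keep prompt + completion within the 4096-token context window.
output_ids = model.generate(input_ids, max_new_tokens=256, do_sample=True, temperature=0.7)
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```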
