Pierre-obi/Mistral_solar-slerp
Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 4k · Published: Jan 12, 2024 · License: apache-2.0 · Architecture: Transformer · Open Weights · Cold

Pierre-obi/Mistral_solar-slerp is a 7 billion parameter language model created by Pierre-obi by merging NousResearch/Nous-Hermes-2-SOLAR-10.7B and mistralai/Mistral-7B-Instruct-v0.2 with the SLERP (spherical linear interpolation) merge method. The merge combines the strengths of both base models, pairing the instruction-following capabilities of Mistral-7B-Instruct-v0.2 with the reasoning ability of Nous-Hermes-2-SOLAR-10.7B. It is designed for general-purpose conversational AI and instruction-following tasks, offering a balanced performance profile.
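The model card does not publish the exact merge configuration, but the core idea of a SLERP merge is to interpolate corresponding weight tensors of the two parent models along the great circle between them rather than along a straight line, which better preserves the geometry of the weights. A minimal sketch of that interpolation on flat weight vectors (plain Python, not the actual tooling used for this merge):

```python
import math

def slerp(t, v0, v1, eps=1e-8):
    """Spherical linear interpolation between two flat weight vectors.

    t=0 returns v0, t=1 returns v1; intermediate t values move along
    the arc between the two vectors instead of the straight chord.
    """
    dot = sum(a * b for a, b in zip(v0, v1))
    n0 = math.sqrt(sum(a * a for a in v0))
    n1 = math.sqrt(sum(b * b for b in v1))
    # Clamp to guard against floating-point drift outside [-1, 1]
    cos_omega = max(-1.0, min(1.0, dot / (n0 * n1)))
    omega = math.acos(cos_omega)
    if abs(math.sin(omega)) < eps:
        # Nearly parallel vectors: fall back to linear interpolation
        return [(1 - t) * a + t * b for a, b in zip(v0, v1)]
    s0 = math.sin((1 - t) * omega) / math.sin(omega)
    s1 = math.sin(t * omega) / math.sin(omega)
    return [s0 * a + s1 * b for a, b in zip(v0, v1)]

# Midpoint between two orthogonal unit vectors stays on the unit sphere
mid = slerp(0.5, [1.0, 0.0], [0.0, 1.0])
```

In practice, merges like this are performed per-tensor over the full checkpoints (commonly with a tool such as mergekit), often with different interpolation factors for attention and MLP layers; the specific factors used here are not stated on the card.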
