MaziyarPanahi/Mistral-ko-7B-v0.1-Mistral-7B-Instruct-v0.2-slerp
Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 4k · Published: Jan 12, 2024 · License: apache-2.0 · Architecture: Transformer · Open Weights

MaziyarPanahi/Mistral-ko-7B-v0.1-Mistral-7B-Instruct-v0.2-slerp is a 7 billion parameter language model created by MaziyarPanahi, formed by merging mistralai/Mistral-7B-Instruct-v0.2 and maywell/Mistral-ko-7B-v0.1 using the slerp method. This model combines the instruction-following capabilities of Mistral-7B-Instruct-v0.2 with the Korean language proficiency of Mistral-ko-7B-v0.1. It is designed for applications requiring a balance of general instruction adherence and strong Korean language understanding, operating with a 4096-token context length.
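To illustrate the slerp (spherical linear interpolation) method named above: the merge interpolates between corresponding parameter tensors of the two source models along an arc rather than a straight line. The sketch below shows the underlying interpolation on plain vectors; it is an illustration of the math only, not the actual merge tooling or its configuration.

```python
import numpy as np

def slerp(t, v0, v1, eps=1e-8):
    """Spherical linear interpolation between two vectors.

    Illustrative only: slerp-based model merging applies this kind of
    interpolation to corresponding weight tensors of the two models.
    """
    v0 = np.asarray(v0, dtype=np.float64)
    v1 = np.asarray(v1, dtype=np.float64)
    # Angle between the (normalized) vectors.
    dot = np.dot(v0 / np.linalg.norm(v0), v1 / np.linalg.norm(v1))
    theta = np.arccos(np.clip(dot, -1.0, 1.0))
    if theta < eps:
        # Nearly colinear: fall back to plain linear interpolation.
        return (1.0 - t) * v0 + t * v1
    # Standard slerp weighting.
    return (np.sin((1.0 - t) * theta) * v0 + np.sin(t * theta) * v1) / np.sin(theta)

# t=0 yields the first model's weights, t=1 the second's,
# intermediate t blends them along the arc between the two.
a = np.array([1.0, 0.0])
b = np.array([0.0, 1.0])
print(slerp(0.5, a, b))  # midpoint on the unit circle: [0.7071..., 0.7071...]
```

In a merge like this one, a per-layer interpolation factor `t` controls how much each source model contributes to each tensor of the merged checkpoint.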
