Markr-AI/Gukbap-Mistral-7B
Text generation · Model size: 7B · Quantization: FP8 · Context length: 4k · Published: Aug 5, 2024 · Architecture: Transformer

Markr-AI/Gukbap-Mistral-7B is a 7-billion-parameter Korean language model developed by HumanF-MarkrAI, fine-tuned from Mistral-7B-Instruct-v0.2. It supports an 8192-token context length and is notable for being trained exclusively on data generated with open-source models, avoiding data distilled from proprietary models. It achieves a score of 6.06 on the LogicKor benchmark, state of the art among Mistral-based Korean models of 7B parameters or fewer, demonstrating strong performance on Korean language tasks.
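Because the model is fine-tuned from Mistral-7B-Instruct-v0.2, prompts presumably follow Mistral's `[INST] ... [/INST]` instruction format. A minimal sketch of building such a prompt (the helper name and optional system hint are illustrative, not part of the model card):

```python
def build_mistral_prompt(user_message: str, system_hint: str = "") -> str:
    """Wrap a user message in the [INST] ... [/INST] format used by
    Mistral-7B-Instruct-v0.2, the base model.
    Helper name and system-hint handling are assumptions for illustration."""
    # Mistral instruct models have no dedicated system role, so a system
    # hint is commonly prepended to the first user turn.
    content = f"{system_hint}\n\n{user_message}".strip() if system_hint else user_message
    return f"<s>[INST] {content} [/INST]"

# Example: a Korean instruction, matching the model's target language.
prompt = build_mistral_prompt("안녕하세요, 자기소개 부탁드립니다.")
```

The resulting string can then be passed to the model's tokenizer and generation loop (e.g. via the Hugging Face `transformers` library) as the full input sequence.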
