shleeeee/mistral-ko-exo-mrc-v1
TEXT GENERATION · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 8k · License: other · Architecture: Transformer

The shleeeee/mistral-ko-exo-mrc-v1 model is a fine-tuned variant of the Mistral-7B architecture, developed by shleeeee (Seunghyeon Lee) and oopsung (Sungwoo Park). It is optimized for Korean language understanding and generation, making it suitable for applications that require robust Korean NLP capabilities.
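A minimal usage sketch with the Hugging Face `transformers` library is shown below. The prompt template in `build_prompt` is an assumption for illustration, not a documented format for this model, and `answer` is a hypothetical helper name:

```python
MODEL_ID = "shleeeee/mistral-ko-exo-mrc-v1"


def build_prompt(question: str, context: str) -> str:
    """Assemble a simple Korean reading-comprehension prompt.

    This layout (passage / question / answer markers) is an assumed
    format, not one documented by the model authors.
    """
    return f"### 지문:\n{context}\n\n### 질문:\n{question}\n\n### 답변:\n"


def answer(question: str, context: str, max_new_tokens: int = 64) -> str:
    """Generate an answer for a question about the given passage."""
    # Deferred import: loading transformers (and ~7B of weights on the
    # first call) is heavy, so keep it out of module import time.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto")

    inputs = tokenizer(build_prompt(question, context), return_tensors="pt")
    inputs = inputs.to(model.device)
    output = model.generate(**inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, skipping the prompt.
    new_tokens = output[0][inputs["input_ids"].shape[1]:]
    return tokenizer.decode(new_tokens, skip_special_tokens=True)
```

With an 8k context window, the passage plus prompt scaffolding should stay well under that token budget for typical reading-comprehension inputs.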
