mohomin123/M-DIE-M-10.7B
Text Generation | Concurrency Cost: 1 | Model Size: 10.7B | Quant: FP8 | Ctx Length: 4k | License: cc-by-nc-sa-4.0 | Architecture: Transformer | Open Weights | Warm

M-DIE-M-10.7B is a 10.7 billion parameter instruction-tuned causal language model developed by Ados, built on Upstage's SOLAR-10.7B-Instruct-v1.0. The model is optimized for Korean language tasks, with Korean text making up 73% of its training data. It handles a range of conversational formats, including single-turn QA, multi-turn QA, and summarization, making it well suited for Korean-centric AI assistant applications.
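Below is a minimal usage sketch with the Hugging Face transformers library, assuming the weights are published on the Hub under the repo id shown above and that the model exposes a chat template like its SOLAR-10.7B-Instruct base; the Korean prompt is a hypothetical single-turn QA example.

```python
# Minimal inference sketch (assumptions: the repo id resolves on the
# Hugging Face Hub and the tokenizer ships a chat template, as the
# SOLAR-10.7B-Instruct base model does).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "mohomin123/M-DIE-M-10.7B"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # half precision to fit the 10.7B weights on one GPU
    device_map="auto",
)

# Single-turn Korean QA, one of the formats the model is tuned for.
# Prompt: "What is the capital of South Korea?"
messages = [{"role": "user", "content": "대한민국의 수도는 어디인가요?"}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=256, do_sample=False)
# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```

For multi-turn QA, the same pattern applies: append alternating user and assistant messages to the `messages` list before calling `apply_chat_template`.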
