maywell/koOpenChat-sft
Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 4k · Published: Nov 14, 2023 · License: cc-by-sa-4.0 · Architecture: Transformer

koOpenChat-sft is a 7 billion parameter instruction-tuned causal language model developed by maywell, based on the OpenChat3.5 architecture. This model is specifically fine-tuned for Korean language tasks, leveraging the ChatML and Alpaca (No-Input) instruction formats. It is designed for general-purpose conversational AI in Korean, offering a balance of performance and efficiency for various applications.
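As a minimal sketch of how a ChatML-style prompt might be assembled for this model, the snippet below builds the conversation string by hand. The helper name and the exact special-token layout (`<|im_start|>` / `<|im_end|>`) are assumptions based on the standard ChatML convention; in practice, verify against the model's tokenizer configuration (e.g. via `tokenizer.apply_chat_template` in the `transformers` library).

```python
def build_chatml_prompt(messages):
    """Join a list of {"role", "content"} dicts into a ChatML prompt string.

    Assumes the standard ChatML special tokens; check the model's
    tokenizer config before relying on this exact layout.
    """
    parts = [
        f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>"
        for m in messages
    ]
    # Leave an open assistant turn for the model to complete.
    parts.append("<|im_start|>assistant\n")
    return "\n".join(parts)


prompt = build_chatml_prompt(
    [{"role": "user", "content": "안녕하세요, 자기소개 해주세요."}]
)
print(prompt)
```

The resulting string would then be tokenized and passed to the model for generation; for Alpaca (No-Input) formatting, the instruction would instead follow that template's `### Instruction:` / `### Response:` layout.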
