quantumaikr/KoreanLM
Text generation · Model size: 7B · Quantization: FP8 · Context length: 4k · Published: May 3, 2023 · Architecture: Transformer · Concurrency cost: 1

KoreanLM by quantumaikr is a 7-billion-parameter language model developed to address the inefficiencies of existing LLMs when handling Korean. It focuses on optimizing tokenization and improving understanding of Korean grammar, vocabulary, and cultural nuances. The project aims to provide a more accurate and efficient model for Korean natural language processing tasks, and to support fine-tuning on enterprise data.
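To illustrate the tokenization inefficiency the project targets (this is a generic sketch, not KoreanLM's actual tokenizer): each Hangul syllable occupies 3 bytes in UTF-8, so byte-level tokenizers common in English-centric LLMs consume roughly three times as many base units per character on Korean text as on ASCII English, which inflates sequence lengths and cost.

```python
# Generic illustration of byte-level inflation on Korean text.
# Not KoreanLM's tokenizer; just stdlib UTF-8 encoding.
ko = "안녕하세요"  # "Hello" in Korean, 5 syllables
en = "Hello"       # 5 ASCII characters

ko_bytes = len(ko.encode("utf-8"))  # each Hangul syllable is 3 UTF-8 bytes -> 15
en_bytes = len(en.encode("utf-8"))  # each ASCII character is 1 byte -> 5

print(ko_bytes, en_bytes)  # → 15 5
```

A Korean-aware vocabulary that treats whole syllables or morphemes as single tokens avoids this inflation, which is the kind of optimization the project describes.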
