MLP-KTLim/llama-3-Korean-Bllossom-8B is an 8-billion-parameter Korean-English bilingual language model developed by the MLP Lab at Seoultech together with Teddysum and Yonsei University. Built on the Llama 3 architecture, it expands the Korean vocabulary by over 30,000 words and improves Korean context handling, supporting a context length of up to 8,192 tokens. The model is optimized for Korean-language tasks: it was pre-trained on roughly 250 GB of Korean data and instruction-tuned on culturally relevant Korean data, and it achieves state-of-the-art scores on the LogicKor Korean benchmark among models under 10B parameters.