MLP-KTLim/llama-3-Korean-Bllossom-8B
Text generation · Concurrency cost: 1 · Model size: 8B · Quant: FP8 · Context length: 8k · Published: Apr 25, 2024 · License: llama3 · Architecture: Transformer

MLP-KTLim/llama-3-Korean-Bllossom-8B is an 8-billion-parameter Korean-English bilingual language model developed by MLPLab at Seoultech, Teddysum, and Yonsei University. Built on the Llama 3 architecture, it expands the Korean vocabulary by over 30,000 tokens and improves Korean context handling, supporting a context length of up to 8192 tokens. The model is optimized for Korean-language tasks, combining extensive Korean pre-training data (250GB) with instruction tuning on culturally relevant data, and achieves state-of-the-art scores on the LogicKor Korean benchmark among models under 10B parameters.


Popular Sampler Settings

Featherless tracks the three most popular sampler configurations used for this model. The tunable parameters are:

temperature (sampling randomness)
top_p (nucleus sampling cutoff)
top_k (restricts sampling to the k most likely tokens)
frequency_penalty (penalizes tokens by how often they have appeared)
presence_penalty (penalizes tokens that have appeared at all)
repetition_penalty (multiplicative penalty on repeated tokens)
min_p (minimum probability threshold relative to the top token)
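As a sketch of how these parameters are typically passed to an OpenAI-compatible completions endpoint, the snippet below assembles a request carrying them. The endpoint URL, API key, and the specific sampler values are illustrative assumptions, not the actual "Popular Sampler Settings" values (which are not shown on this page).

```python
import json
import urllib.request

# Hypothetical sampler values -- common defaults, not the configs
# tracked by Featherless for this model.
SAMPLER = {
    "temperature": 0.7,
    "top_p": 0.9,
    "top_k": 40,
    "frequency_penalty": 0.0,
    "presence_penalty": 0.0,
    "repetition_penalty": 1.1,
    "min_p": 0.05,
}

def build_request(prompt: str, api_key: str,
                  url: str = "https://api.featherless.ai/v1/chat/completions"):
    """Assemble (but do not send) a chat-completions request that carries
    the sampler settings alongside the model name. The URL is an assumption
    based on Featherless's OpenAI-compatible API; check the official docs
    before use."""
    payload = {
        "model": "MLP-KTLim/llama-3-Korean-Bllossom-8B",
        "messages": [{"role": "user", "content": prompt}],
        **SAMPLER,
    }
    return urllib.request.Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",
        },
        method="POST",
    )

# Example: a Korean prompt ("Hello! Please introduce yourself.")
req = build_request("안녕하세요! 자기소개 해주세요.", api_key="YOUR_KEY")
```

Sending the request (e.g. with `urllib.request.urlopen(req)`) would return a standard chat-completions response; the request is built separately here so the sampler payload can be inspected before use.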