GAI-LLM/ko-en-llama2-13b-mixed-v2
Text Generation · Concurrency Cost: 1 · Model Size: 13B · Quant: FP8 · Ctx Length: 4K · License: cc-by-nc-2.0 · Architecture: Transformer · Open Weights · Warm

GAI-LLM/ko-en-llama2-13b-mixed-v2 is a 13-billion-parameter auto-regressive language model developed by Donghoon Oh, Hanmin Myung, and Eunyoung Kim (SK C&C G.AI Eng) on the LLaMA2 transformer architecture. It is fine-tuned for mixed Korean and English language processing on a combination of Open Korean Datasets, and is designed for tasks requiring understanding and generation in both Korean and English contexts.
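
For local experimentation, the model can be loaded through the standard Hugging Face transformers API. The sketch below is a minimal example, not an official recipe: the fp16 dtype and device placement are illustrative choices, and a 13B model in fp16 needs roughly 26 GB of GPU memory.

```python
# Minimal loading sketch, assuming the standard Hugging Face transformers API.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "GAI-LLM/ko-en-llama2-13b-mixed-v2"

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForCausalLM.from_pretrained(
    MODEL_ID,
    torch_dtype=torch.float16,  # illustrative; ~26 GB GPU memory for 13B in fp16
    device_map="auto",
)

# Mixed Korean/English prompt, matching the model's bilingual fine-tuning.
prompt = "다음 문장을 영어로 번역하세요: 오늘 날씨가 정말 좋습니다."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```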


Popular Sampler Settings

The three most popular parameter combinations used by Featherless users for this model are built from the sampler parameters below; a hedged request sketch follows the list.

- temperature
- top_p
- top_k
- frequency_penalty
- presence_penalty
- repetition_penalty
- min_p
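
As a sketch of how these settings might be passed in practice, the example below sends them through an OpenAI-compatible chat completions request. The base URL, the set of accepted extra parameters, and all numeric values are assumptions for illustration (they are not the actual top-three configs); check the Featherless API documentation for the authoritative interface.

```python
# Hedged request sketch against an assumed OpenAI-compatible endpoint.
import os
import requests

API_BASE = "https://api.featherless.ai/v1"  # assumed endpoint

payload = {
    "model": "GAI-LLM/ko-en-llama2-13b-mixed-v2",
    "messages": [{"role": "user", "content": "한국어와 영어를 섞어서 자기소개를 해줘."}],
    # Standard OpenAI-style sampler parameters (values are placeholders):
    "temperature": 0.7,
    "top_p": 0.9,
    "frequency_penalty": 0.0,
    "presence_penalty": 0.0,
    # Extensions many OpenAI-compatible servers accept (also placeholders):
    "top_k": 40,
    "repetition_penalty": 1.1,
    "min_p": 0.05,
}

resp = requests.post(
    f"{API_BASE}/chat/completions",
    headers={"Authorization": f"Bearer {os.environ['FEATHERLESS_API_KEY']}"},
    json=payload,
    timeout=60,
)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])
```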