AIdenU/LLAMA-2-13b-ko-Y24_v2.0
Text generation · Concurrency cost: 1 · Model size: 13B · Quantization: FP8 · Context length: 4k · License: apache-2.0 · Architecture: Transformer · Open weights · Warm
AIdenU/LLAMA-2-13b-ko-Y24_v2.0 is a 13 billion parameter language model developed by AIdenU, based on the Llama-2 architecture. This model is specifically fine-tuned for Korean language processing, offering a context length of 4096 tokens. It is designed to provide robust performance for applications requiring understanding and generation of Korean text.
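As a usage sketch, the model can be loaded with the standard Hugging Face `transformers` API. The model id below comes from this card; the prompt template and generation settings are illustrative assumptions, not values documented for this fine-tune.

```python
MODEL_ID = "AIdenU/LLAMA-2-13b-ko-Y24_v2.0"

def build_prompt(instruction: str) -> str:
    """Wrap a Korean instruction in a simple question/answer template.

    NOTE: this template is a generic assumption; consult the model card
    for the exact format the fine-tune was trained on.
    """
    return f"### 질문: {instruction}\n### 답변:"

if __name__ == "__main__":
    # Requires `pip install transformers accelerate` and enough GPU
    # memory for a 13B-parameter model.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto")

    prompt = build_prompt("서울의 인구는 얼마나 되나요?")  # "What is the population of Seoul?"
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

    # Keep prompt + output within the model's 4096-token context window.
    output = model.generate(**inputs, max_new_tokens=256)
    print(tokenizer.decode(output[0], skip_special_tokens=True))
```

The heavy model download is guarded under `__main__`, so the prompt helper can be reused (for example, in a batching script) without pulling 13B weights.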
Popular Sampler Settings
The top three parameter combinations used by Featherless users for this model.
Tunable parameters: temperature, top_p, top_k, frequency_penalty, presence_penalty, repetition_penalty, min_p.
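The per-tab values did not render on the captured page, so as a hedged sketch, here is one plausible configuration using the parameters listed above, with commonly used default values (these are illustrative, not the card's actual "top 3" combinations), plus a small sanity-check helper:

```python
# Illustrative sampler settings; values are conventional defaults,
# NOT the configurations reported by Featherless users.
SAMPLER_CONFIG = {
    "temperature": 0.7,        # randomness of token sampling
    "top_p": 0.9,              # nucleus-sampling cumulative cutoff
    "top_k": 40,               # keep only the k most likely tokens
    "frequency_penalty": 0.0,  # penalize tokens by how often they appear
    "presence_penalty": 0.0,   # penalize tokens that appeared at all
    "repetition_penalty": 1.1, # multiplicative penalty on repeats
    "min_p": 0.05,             # drop tokens below this fraction of the top token's probability
}

def validate(cfg: dict) -> None:
    """Check each value against its conventional range."""
    assert 0.0 <= cfg["temperature"] <= 2.0
    assert 0.0 < cfg["top_p"] <= 1.0
    assert cfg["top_k"] >= 0
    assert cfg["repetition_penalty"] >= 1.0
    assert 0.0 <= cfg["min_p"] <= 1.0

validate(SAMPLER_CONFIG)
```

A dict like this maps directly onto the `generation_config` / request-body parameters accepted by most OpenAI-compatible inference endpoints.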