Changgil/k2s3_test_24001
Text Generation · Concurrency cost: 1 · Model size: 13B · Quantization: FP8 · Context length: 4k · Published: Feb 14, 2024 · License: llama2 · Architecture: Transformer · Open weights · Warm
Changgil/k2s3_test_24001 is a 13-billion-parameter language model developed by Changgil Song, fine-tuned from Meta's Llama-2-13b-chat-hf. It was trained on approximately 800 million tokens drawn from the Standard Korean Dictionary, KULLM data, dissertation abstracts, and AI Hub Korean language samples. The model is optimized for Korean language understanding and generation, using PEFT LoRA for efficient fine-tuning, and is suited to applications that require robust Korean-language capabilities.
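The PEFT LoRA fine-tuning mentioned above trains only a low-rank update on top of frozen base weights. A minimal NumPy sketch of the idea is below; the dimensions, rank, and scaling factor are illustrative placeholders, not the actual Llama-2-13b configuration.

```python
import numpy as np

# Illustrative shapes only -- not the real Llama-2-13b dimensions.
d_out, d_in, rank = 8, 8, 2
alpha = 4.0  # LoRA scaling hyperparameter (assumed value)

rng = np.random.default_rng(0)
W = rng.standard_normal((d_out, d_in))  # frozen base weight
A = rng.standard_normal((rank, d_in))   # trainable down-projection
B = np.zeros((d_out, rank))             # trainable up-projection, zero-initialized

# Effective weight: the update B @ A has at most `rank` degrees of
# freedom, so far fewer parameters are trained than in full fine-tuning.
W_eff = W + (alpha / rank) * (B @ A)

# With B initialized to zero, the adapted model starts out
# identical to the base model.
assert np.allclose(W_eff, W)
```

In practice libraries such as Hugging Face PEFT apply this update to selected attention and projection matrices while keeping the 13B base weights frozen.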
Popular Sampler Settings
The three parameter combinations most used by Featherless users for this model cover the following sampler settings: temperature, top_p, top_k, frequency_penalty, presence_penalty, repetition_penalty, and min_p.
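To show how the first three of these settings interact, here is a self-contained sketch of temperature scaling followed by top-k and top-p (nucleus) filtering of a next-token distribution. The function name and the parameter values are illustrative assumptions, not a recommended configuration for this model.

```python
import math

def sample_filter(logits, temperature=0.7, top_k=3, top_p=0.9):
    """Sketch of how common sampler settings reshape a distribution.
    Values here are placeholders, not a tuned config for this model."""
    # Temperature: scale logits before softmax (lower = sharper).
    scaled = [l / temperature for l in logits]
    m = max(scaled)
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]

    # Top-k: keep only the k most probable tokens.
    order = sorted(range(len(probs)), key=lambda i: probs[i], reverse=True)
    keep = set(order[:top_k])

    # Top-p (nucleus): among those, keep the smallest prefix whose
    # cumulative probability reaches top_p.
    cum, nucleus = 0.0, set()
    for i in order:
        if i in keep:
            nucleus.add(i)
            cum += probs[i]
            if cum >= top_p:
                break

    # Renormalize over the surviving tokens.
    z = sum(probs[i] for i in nucleus)
    return [probs[i] / z if i in nucleus else 0.0 for i in range(len(probs))]

dist = sample_filter([2.0, 1.0, 0.5, -1.0, -3.0])
```

The penalty settings (frequency_penalty, presence_penalty, repetition_penalty) and min_p act similarly as logit or probability adjustments applied before sampling.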