kyujinpy/KoT-platypus2-13B
Text Generation · Concurrency Cost: 1 · Model Size: 13B · Quant: FP8 · Ctx Length: 4K · License: cc-by-nc-sa-4.0 · Architecture: Transformer · Open Weights · Warm

kyujinpy/KoT-platypus2-13B is a 13-billion-parameter auto-regressive language model developed by Kyujin Han, based on the LLaMA2 transformer architecture. Fine-tuned from KO-Platypus2-13B, it adds Chain-of-Thought (CoT) reasoning via the KoCoT_2000 dataset, a Korean translation of the kaist-CoT dataset. The model is optimized for Korean-language tasks and outperforms its base model on the Open Ko-LLM Leaderboard benchmarks.
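A minimal sketch of loading the checkpoint with Hugging Face transformers. The Alpaca-style prompt template is an assumption based on related KO-Platypus2 models, not something stated here; verify against the model card before relying on it.

```python
# Minimal sketch: load KoT-platypus2-13B with Hugging Face transformers.
# A 13B model in float16 needs roughly 26 GB of GPU memory; adjust the
# dtype, device_map, or add quantization for smaller cards.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo = "kyujinpy/KoT-platypus2-13B"
tokenizer = AutoTokenizer.from_pretrained(repo)
model = AutoModelForCausalLM.from_pretrained(
    repo,
    torch_dtype=torch.float16,
    device_map="auto",
)

# Assumed Alpaca-style instruction format (check the model card for the
# exact template). The prompt asks, in Korean: "What is Chain-of-Thought
# reasoning? Explain step by step."
prompt = (
    "### Instruction:\n"
    "Chain-of-Thought 추론이란 무엇인가요? 단계별로 설명해 주세요.\n\n"
    "### Response:\n"
)
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(
    **inputs,
    max_new_tokens=256,
    do_sample=True,
    temperature=0.7,
    top_p=0.9,
)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```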


Popular Sampler Settings

Featherless tracks the parameter combinations its users most often apply to this model. The tunable sampler parameters are listed below; a sketch of passing them in a request follows the list.

temperature
top_p
top_k
frequency_penalty
presence_penalty
repetition_penalty
min_p
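A sketch of sending these sampler settings through an OpenAI-compatible chat completion request. The base URL, the FEATHERLESS_API_KEY environment variable, and every parameter value shown are illustrative assumptions, not a recorded user configuration; consult the Featherless docs for the actual endpoint and sensible defaults.

```python
# Sketch: pass sampler settings to an OpenAI-compatible endpoint.
# base_url and all parameter values are assumptions for illustration.
import os
from openai import OpenAI

client = OpenAI(
    base_url="https://api.featherless.ai/v1",   # assumed endpoint
    api_key=os.environ["FEATHERLESS_API_KEY"],  # hypothetical env var
)

response = client.chat.completions.create(
    model="kyujinpy/KoT-platypus2-13B",
    messages=[{"role": "user", "content": "Explain Chain-of-Thought prompting in Korean."}],
    temperature=0.7,        # softens the next-token distribution
    top_p=0.9,              # nucleus sampling cutoff
    frequency_penalty=0.0,  # penalizes tokens by how often they appeared
    presence_penalty=0.0,   # penalizes tokens that appeared at all
    # top_k, repetition_penalty, and min_p are not in the OpenAI schema;
    # OpenAI-compatible servers commonly accept them via extra_body.
    extra_body={"top_k": 40, "repetition_penalty": 1.1, "min_p": 0.05},
)
print(response.choices[0].message.content)
```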