yentinglin/Taiwan-LLM-13B-v2.0-chat
Text Generation · Concurrency Cost: 1 · Model Size: 13B · Quant: FP8 · Ctx Length: 4k · Published: Oct 17, 2023 · License: apache-2.0 · Architecture: Transformer · Open Weights

Taiwan-LLM-13B-v2.0-chat is a 13-billion-parameter GPT-like language model developed by Yen-Ting Lin and Yun-Nung Chen and fine-tuned for Traditional Chinese. Its training data is enriched with diverse Taiwanese textual sources, and the model is refined through supervised fine-tuning (SFT), so it excels at language understanding and generation aligned with Taiwan's cultural nuances. It shows improved performance on benchmarks such as TC-Eval, demonstrating strong contextual comprehension and cultural relevance.
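As an illustrative usage sketch, the snippet below formats a single user turn into a prompt string for the model. The system prompt and the Vicuna-style `USER:`/`ASSISTANT:` turn markers are assumptions here, not the confirmed official template; consult the model card on Hugging Face for the authoritative chat format.

```python
# Build a single-turn chat prompt for Taiwan-LLM-13B-v2.0-chat.
# NOTE: the system prompt and turn markers below are assumptions
# (a common Vicuna-style layout), not the confirmed official template.

SYSTEM_PROMPT = "You are a helpful assistant that answers in Traditional Chinese."  # assumed

def build_prompt(user_message: str, system_prompt: str = SYSTEM_PROMPT) -> str:
    """Format one user turn into a Vicuna-style prompt string."""
    return f"{system_prompt} USER: {user_message} ASSISTANT:"

prompt = build_prompt("台灣最高的山是哪一座？")
print(prompt)
```

The resulting string would then be sent to whichever completion endpoint serves the model.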


Popular Sampler Settings

The three parameter combinations most used by Featherless users for this model each set the following sampler parameters:

temperature
top_p
top_k
frequency_penalty
presence_penalty
repetition_penalty
min_p
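The parameters above map onto the sampling fields of an OpenAI-compatible completions request (note that `top_k`, `repetition_penalty`, and `min_p` are common extensions rather than part of the core OpenAI schema). The sketch below builds such a request body; the values are illustrative placeholders, not a recommended configuration.

```python
import json

# Sketch of a completions request body carrying the sampler settings
# listed above. Values are illustrative placeholders, not a tuned config.
payload = {
    "model": "yentinglin/Taiwan-LLM-13B-v2.0-chat",
    "prompt": "台北有什麼著名的夜市？",
    "max_tokens": 256,
    "temperature": 0.7,        # randomness of sampling
    "top_p": 0.9,              # nucleus-sampling probability cutoff
    "top_k": 40,               # keep only the k most likely tokens
    "frequency_penalty": 0.0,  # penalize tokens by how often they already appeared
    "presence_penalty": 0.0,   # penalize tokens that appeared at all
    "repetition_penalty": 1.1, # multiplicative penalty on repeated tokens
    "min_p": 0.05,             # drop tokens below this fraction of the top probability
}
print(json.dumps(payload, ensure_ascii=False, indent=2))
```

Sending this JSON body to the serving endpoint applies the chosen sampler settings to that generation request.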