yentinglin/Taiwan-LLaMa-v1.0
Text Generation · Concurrency Cost: 1 · Model Size: 13B · Quant: FP8 · Ctx Length: 4k · Published: Aug 10, 2023 · License: llama2 · Architecture: Transformer · 0.1K · Open Weights · Cold
Taiwan-LLaMa-v1.0 by yentinglin is a 13-billion-parameter language model, built on LLaMA 2 and fine-tuned for Traditional Chinese with a focus on the linguistic and cultural context of Taiwan. It was refined through supervised fine-tuning (SFT) on diverse Taiwanese textual sources, improving both language understanding and generation. The model shows strong performance on Traditional Chinese benchmarks such as TC-Eval, making it well suited to applications that require contextual comprehension and cultural fluency in Taiwanese Traditional Chinese.
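Instruction-tuned LLaMA-2 derivatives generally expect prompts in the chat template they were fine-tuned with. The sketch below assumes a Vicuna-style template (system preamble plus USER/ASSISTANT turns), which is common for SFT models of this generation; check the model card for the exact format before relying on it. The `build_prompt` helper and its system text are illustrative, not taken from the model's documentation.

```python
def build_prompt(user_message: str) -> str:
    """Format a single-turn prompt in a Vicuna-style chat template.

    This template is an assumption for illustration; verify the exact
    format on the yentinglin/Taiwan-LLaMa-v1.0 model card.
    """
    system = (
        "A chat between a curious user and an artificial intelligence assistant. "
        "The assistant gives helpful, detailed, and polite answers to the user's questions."
    )
    return f"{system} USER: {user_message} ASSISTANT:"


prompt = build_prompt("台北有哪些值得一遊的夜市？")
# The resulting string would then be passed to the model, e.g. via
# transformers:
#   from transformers import pipeline
#   pipe = pipeline("text-generation", model="yentinglin/Taiwan-LLaMa-v1.0")
#   pipe(prompt, max_new_tokens=256)
```

Ending the prompt with "ASSISTANT:" cues the model to produce the assistant's reply rather than continuing the user's turn.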