yentinglin/Taiwan-LLaMa-v0.9
Text generation · 13B parameters · FP8 quantization · 4k context length · Llama 2 license · Transformer architecture · Open weights

Taiwan-LLaMa-v0.9 by yentinglin is a 13-billion-parameter language model based on Llama 2, fine-tuned for Traditional Chinese with a specific focus on Taiwanese linguistic and cultural contexts. Trained on diverse Taiwanese textual sources, it targets strong language understanding and generation in this domain and shows improved performance on benchmarks such as TC-Eval. It is primarily intended for applications requiring deep contextual comprehension and cultural relevance within the Traditional Chinese-speaking Taiwanese community.
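A minimal usage sketch with the Hugging Face `transformers` library is shown below. The chat template in `build_prompt` is an assumption (a Vicuna-style format common among Llama 2 chat fine-tunes); consult the model card for the exact prompt format the model was trained with.

```python
# Sketch: loading and prompting yentinglin/Taiwan-LLaMa-v0.9.
# The prompt template is an ASSUMED Vicuna-style format, not confirmed
# by the model card.

def build_prompt(user_message: str) -> str:
    """Wrap a user message in an assumed Vicuna-style chat template."""
    system = ("A chat between a curious user and an AI assistant. "
              "The assistant answers in Traditional Chinese.")
    return f"{system} USER: {user_message} ASSISTANT:"

if __name__ == "__main__":
    # Loading the 13B model requires substantial GPU memory;
    # shown here for illustration only.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "yentinglin/Taiwan-LLaMa-v0.9"
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

    prompt = build_prompt("台北有哪些著名的夜市？")
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    output = model.generate(**inputs, max_new_tokens=256)
    print(tokenizer.decode(output[0], skip_special_tokens=True))
```

The model-loading section is guarded under `__main__` so the prompt-building helper can be reused or tested without downloading the weights.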
