yentinglin/Taiwan-LLM-7B-v2.0.1-chat
Text Generation | Concurrency Cost: 1 | Model Size: 7B | Quant: FP8 | Ctx Length: 4k | Published: Oct 10, 2023 | License: apache-2.0 | Architecture: Transformer | Open Weights

Taiwan-LLM-7B-v2.0.1-chat by yentinglin is a 7-billion-parameter, GPT-style language model fine-tuned for Traditional Chinese, with a focus on the linguistic and cultural context of Taiwan. Built on a large base model and further trained on diverse Taiwanese text sources, it targets strong language understanding and generation in Traditional Chinese. The model shows improved results on benchmarks such as TC-Eval, reflecting its contextual comprehension and cultural relevance for Traditional Chinese applications.
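Since this is a chat-tuned checkpoint, it can be run locally for conversational inference. Below is a minimal sketch using the Hugging Face transformers library; it assumes the repository ships a chat template in its tokenizer config, and the sample prompt is purely illustrative. The FP8 quantization noted above is a deployment detail; this sketch simply loads the weights in half precision.

```python
# Minimal inference sketch (assumes transformers, torch, and accelerate are
# installed, and that the repo's tokenizer config defines a chat template).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "yentinglin/Taiwan-LLM-7B-v2.0.1-chat"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # half precision to fit a single GPU
    device_map="auto",          # requires the accelerate package
)

# Illustrative Traditional Chinese prompt: "Please introduce Taiwan's
# night-market culture in Traditional Chinese."
messages = [
    {"role": "user", "content": "請用繁體中文介紹台灣的夜市文化。"},
]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(
    inputs, max_new_tokens=256, do_sample=True, temperature=0.7
)
# Strip the prompt tokens and print only the newly generated reply.
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```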
