yentinglin/Taiwan-LLM-7B-v2.1-chat
Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 4k · Published: Oct 12, 2023 · License: apache-2.0 · Architecture: Transformer · Open Weights
Taiwan LLM 7B v2.1 chat is a 7-billion-parameter GPT-style language model developed by Yen-Ting Lin and Yun-Nung Chen and fine-tuned for Traditional Chinese. Trained on diverse Taiwanese textual sources and refined through supervised fine-tuning (SFT), it excels at language understanding and generation aligned with Taiwan's cultural nuances. The model shows improved performance on benchmarks such as TC-Eval, demonstrating contextual comprehension and cultural relevance, which makes it well suited to applications that require a deep understanding of Traditional Chinese.
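A minimal sketch of querying the model with the Hugging Face `transformers` library. The single-turn, Vicuna-style prompt layout in `build_prompt` is an assumption for illustration (as is the Chinese system prompt); in practice, prefer `tokenizer.apply_chat_template`, which uses the chat template shipped with the model.

```python
# Sketch: single-turn chat with yentinglin/Taiwan-LLM-7B-v2.1-chat.
# The prompt layout below is an ASSUMED Vicuna-style template, not
# guaranteed to match the model's actual chat template.

def build_prompt(system: str, user: str) -> str:
    """Assemble a single-turn prompt (assumed Vicuna-style layout)."""
    return f"{system} USER: {user} ASSISTANT:"

if __name__ == "__main__":
    # Hypothetical system prompt; substitute the one recommended by the model card.
    system = "你是一個樂於助人的繁體中文助理。"
    prompt = build_prompt(system, "台灣最高的山是哪一座?")
    print(prompt)

    # Actual generation (commented out: requires downloading the 7B weights):
    # from transformers import AutoModelForCausalLM, AutoTokenizer
    # tok = AutoTokenizer.from_pretrained("yentinglin/Taiwan-LLM-7B-v2.1-chat")
    # model = AutoModelForCausalLM.from_pretrained(
    #     "yentinglin/Taiwan-LLM-7B-v2.1-chat", device_map="auto")
    # inputs = tok(prompt, return_tensors="pt").to(model.device)
    # out = model.generate(**inputs, max_new_tokens=256)
    # print(tok.decode(out[0], skip_special_tokens=True))
```

With `apply_chat_template`, the same turn would be passed as `[{"role": "system", ...}, {"role": "user", ...}]` and the tokenizer would handle the formatting.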