yentinglin/Taiwan-LLM-7B-v2.0-chat
Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 4k · Published: Oct 9, 2023 · License: apache-2.0 · Architecture: Transformer · Open Weights

The Taiwan-LLM-7B-v2.0-chat model by yentinglin is a 7-billion-parameter, GPT-style language model fine-tuned for Traditional Chinese, with a focus on Taiwanese linguistic and cultural contexts. Built on a large base model and further trained on diverse Taiwanese textual sources, it targets strong language understanding and generation, with improved results reported on benchmarks such as TC-Eval. The model is designed to align with Taiwan's cultural nuances, making it suitable for applications that require culturally grounded Traditional Chinese processing.
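Because this is a chat-tuned checkpoint, prompts should follow the model's conversation format. The sketch below shows the general idea of rendering a multi-turn chat into a single prompt string; the USER/ASSISTANT layout here is an illustrative assumption, not the model's confirmed template, and in practice the authoritative format should come from the tokenizer's own chat template (e.g. `tokenizer.apply_chat_template` in Hugging Face Transformers).

```python
# Hypothetical prompt-formatting helper for a chat-tuned model such as
# Taiwan-LLM-7B-v2.0-chat. The USER:/ASSISTANT: layout is an assumption for
# illustration; the real template ships with the model's tokenizer.
def format_chat(system: str, turns: list[tuple[str, str]], next_user: str) -> str:
    """Render a system prompt plus prior (user, assistant) turns into one prompt."""
    parts = [system]
    for user, assistant in turns:
        parts.append(f"USER: {user}")
        parts.append(f"ASSISTANT: {assistant}")
    parts.append(f"USER: {next_user}")
    parts.append("ASSISTANT:")  # the model generates its reply from here
    return "\n".join(parts)

prompt = format_chat(
    "你是一個樂於助人的繁體中文助理。",  # system prompt in Traditional Chinese
    [],                                  # no prior turns in this example
    "台灣的夜市文化有什麼特色？",
)
```

The resulting string would then be tokenized and passed to the model for generation; keeping the template in one helper makes it easy to swap in the tokenizer's real chat template later.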
