yentinglin/Taiwan-LLM-13B-v2.0-base
Text generation · 13B parameters · FP8 quantization · 4k context length · apache-2.0 license · Transformer architecture · open weights

yentinglin/Taiwan-LLM-13B-v2.0-base is a 13-billion-parameter GPT-style language model developed by Yen-Ting Lin and Yun-Nung Chen, fine-tuned from meta-llama/Llama-2-13b-hf. The model is specifically tailored for Traditional Chinese, focusing on the linguistic and cultural context of Taiwan, with training data drawn from diverse Taiwanese textual sources. It performs well at language understanding and generation that aligns with Taiwan's cultural nuances, and as a base model it is intended to be further fine-tuned into instruction-following or chat applications.
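As a sketch of how the base model might be loaded for generation, the following uses the Hugging Face `transformers` library. The model ID comes from this card; the dtype, device placement, and sampling parameters are illustrative assumptions, not recommendations from the model authors, and as a base (non-chat) model it should be prompted for plain text continuation.

```python
MODEL_ID = "yentinglin/Taiwan-LLM-13B-v2.0-base"

def generate(prompt: str, max_new_tokens: int = 128) -> str:
    """Generate a Traditional Chinese continuation for `prompt`."""
    # Imported lazily so the sketch can be read without transformers installed.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID,
        torch_dtype=torch.float16,  # 13B weights; half precision to fit common GPUs
        device_map="auto",
    )
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    outputs = model.generate(
        **inputs,
        max_new_tokens=max_new_tokens,
        do_sample=True,        # sampling settings are placeholder values
        temperature=0.7,
    )
    return tokenizer.decode(outputs[0], skip_special_tokens=True)
```

For example, `generate("台灣最高的山是")` ("The highest mountain in Taiwan is") would return a free-text continuation; for instruction-following behavior, the card above notes the base model is meant to be fine-tuned first.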
