yentinglin/Taiwan-LLM-7B-v2.0-base
Text generation · Concurrency cost: 1 · Model size: 7B · Quantization: FP8 · Context length: 4k · Published: Oct 6, 2023 · License: apache-2.0 · Architecture: Transformer · Open weights

Taiwan-LLM-7B-v2.0-base is a 7-billion-parameter GPT-like language model developed by Yen-Ting Lin and Yun-Nung Chen, fine-tuned from Llama-2-7b-hf. It is tailored to Traditional Chinese and to the linguistic and cultural context of Taiwan, and its cultural grounding yields improved language understanding and generation, reflected in stronger results on Taiwan-specific benchmarks such as TC-Eval.
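As a usage sketch (not part of the original card), the base model can be loaded with the Hugging Face `transformers` library for plain text completion; the generation parameters below are illustrative assumptions, not recommendations from the authors.

```python
# Minimal sketch: completing a Traditional Chinese prompt with
# Taiwan-LLM-7B-v2.0-base via Hugging Face transformers.
# Assumptions: transformers is installed and the model weights
# can be downloaded from the Hub; parameters are illustrative.

MODEL_ID = "yentinglin/Taiwan-LLM-7B-v2.0-base"

def generate(prompt: str, max_new_tokens: int = 128) -> str:
    """Complete a prompt with the base model (greedy decoding)."""
    # Imported lazily so the sketch can be read without the library installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto")
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    output = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(output[0], skip_special_tokens=True)

if __name__ == "__main__":
    # Base model, no chat template: give it raw text to continue.
    print(generate("台灣最高的山是"))
```

Note that this is the base (pre-trained) checkpoint, so it continues raw text rather than following instructions; chat-style prompting belongs to the fine-tuned variants.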
