yentinglin/Taiwan-LLaMa-v0.0
Text generation · Concurrency cost: 1 · Model size: 13B · Quantization: FP8 · Context length: 4k · License: llama2 · Architecture: Transformer · Open weights

Taiwan-LLaMa-v0.0 by yentinglin is a 13-billion-parameter GPT-style language model fine-tuned from Llama-2-13b-chat-hf and designed specifically for Traditional Chinese. It is trained on diverse Taiwanese textual sources and refined through supervised fine-tuning, giving it strong language understanding and generation with high cultural relevance. The model shows improved performance on benchmarks such as TC-Eval, making it suitable for applications that require nuanced comprehension of Taiwanese linguistic and cultural contexts.
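A minimal sketch of using the model for Traditional Chinese text generation with the Hugging Face `transformers` library. The prompt template below follows the Llama-2 chat convention inherited from the base model; this is an assumption, since the card does not specify the exact template, and the example system and user messages are illustrative only:

```python
# Sketch: generating Traditional Chinese text with Taiwan-LLaMa-v0.0.
# The Llama-2 chat prompt format is assumed from the base model
# (Llama-2-13b-chat-hf); verify against the model card before use.

def build_prompt(user_message: str,
                 system_message: str = "你是一個只說繁體中文的助理。") -> str:
    """Format a single-turn prompt in the (assumed) Llama-2 chat style."""
    return (f"[INST] <<SYS>>\n{system_message}\n<</SYS>>\n\n"
            f"{user_message} [/INST]")

def generate(prompt: str, max_new_tokens: int = 256) -> str:
    """Load the 13B weights and generate; needs substantial GPU memory."""
    from transformers import AutoModelForCausalLM, AutoTokenizer
    model_id = "yentinglin/Taiwan-LLaMa-v0.0"
    tok = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")
    inputs = tok(prompt, return_tensors="pt").to(model.device)
    out = model.generate(**inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, not the echoed prompt.
    return tok.decode(out[0][inputs["input_ids"].shape[1]:],
                      skip_special_tokens=True)

if __name__ == "__main__":
    print(build_prompt("台北有哪些著名的夜市？"))
```

The generation call is kept inside a function so the prompt formatting can be inspected without downloading the 13B weights.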
