yrc696/ETLCH-instruct_based_on_llama3.2-1b_taiwan_traditional_chinese
Text Generation · Model Size: 1B · Quant: BF16 · Ctx Length: 32k · Concurrency Cost: 1 · Published: Aug 18, 2025 · License: AFL-3.0 · Architecture: Transformer

yrc696/ETLCH-instruct_based_on_llama3.2-1b_taiwan_traditional_chinese is a 1-billion-parameter instruction-tuned language model based on the Llama 3.2 architecture. Developed by researchers from National Tsing Hua University, National Yang Ming Chiao Tung University, and the University of Taipei, the model is tuned for stable, higher-quality Traditional Chinese output. According to its authors, it significantly outperforms the base Llama 3.2-1B-Instruct model on Chinese text generation, making it suitable for research and further fine-tuning in Traditional Chinese NLP applications.
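A minimal usage sketch with the Hugging Face `transformers` library, assuming the checkpoint is hosted under the repo ID above and that the tokenizer ships a Llama 3.2-style chat template (neither is verified here; the system prompt is illustrative):

```python
# Sketch: load the instruction-tuned checkpoint and generate Traditional
# Chinese text. The repo ID is taken from the model card; network access
# and roughly 4 GB of memory (BF16, 1B parameters) are assumed.
MODEL_ID = "yrc696/ETLCH-instruct_based_on_llama3.2-1b_taiwan_traditional_chinese"

def build_messages(user_text: str) -> list[dict]:
    """Build a chat-style message list for the instruct model."""
    return [
        # Hypothetical system prompt asking for Traditional Chinese replies.
        {"role": "system", "content": "你是一個使用繁體中文回答的助理。"},
        {"role": "user", "content": user_text},
    ]

if __name__ == "__main__":
    # Imported here so the module loads without transformers installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, torch_dtype="bfloat16")

    # Format the conversation with the tokenizer's built-in chat template.
    inputs = tokenizer.apply_chat_template(
        build_messages("請用繁體中文介紹台北。"),  # "Introduce Taipei in Traditional Chinese."
        add_generation_prompt=True,
        return_tensors="pt",
    )
    outputs = model.generate(inputs, max_new_tokens=128)
    print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

At 1B parameters in BF16 the model fits comfortably on a single consumer GPU or even CPU, which is consistent with the low concurrency cost listed above.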
