weiren119/traditional_chinese_qlora_llama2_13b_merged
Text generation · Model size: 13B · Quantization: FP8 · Context length: 4k · License: apache-2.0 · Architecture: Transformer · Open weights
The weiren119/traditional_chinese_qlora_llama2_13b_merged model is a 13-billion-parameter language model based on Llama 2, fine-tuned by weiren119 with QLoRA on a Traditional Chinese translation of the Alpaca dataset. It specializes in understanding and generating Traditional Chinese and is optimized for instruction-following tasks, making it suitable for applications that require robust Traditional Chinese language capabilities.
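Because the model was fine-tuned on a translated Alpaca dataset, prompts most likely follow the standard Alpaca instruction template. The helper below is a sketch of that template; the exact prompt format used during fine-tuning is an assumption, not confirmed by the model card.

```python
def build_alpaca_prompt(instruction: str, input_text: str = "") -> str:
    """Build a standard Alpaca-style prompt.

    Assumption: the model was trained with the original Alpaca template
    (English preamble, ### Instruction / ### Input / ### Response headers);
    verify against the fine-tuning code before relying on this.
    """
    if input_text:
        return (
            "Below is an instruction that describes a task, paired with an "
            "input that provides further context. Write a response that "
            "appropriately completes the request.\n\n"
            f"### Instruction:\n{instruction}\n\n"
            f"### Input:\n{input_text}\n\n"
            "### Response:\n"
        )
    return (
        "Below is an instruction that describes a task. Write a response "
        "that appropriately completes the request.\n\n"
        f"### Instruction:\n{instruction}\n\n"
        "### Response:\n"
    )


# Example: a Traditional Chinese instruction, formatted for generation.
prompt = build_alpaca_prompt("請用繁體中文簡介台灣的夜市文化。")
```

The resulting string can then be tokenized and passed to the merged model with any standard text-generation pipeline.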