openlmlab/open-chinese-llama-7b-patch
Text generation · Model size: 7B · Quantization: FP8 · Context length: 4k · Published: Apr 24, 2023 · License: apache-2.0 · Architecture: Transformer

Open-Chinese-LLaMA-7B-Patch by OpenLMLab is a 7-billion-parameter LLaMA-based model incrementally pre-trained on Chinese datasets to strengthen Chinese language understanding and generation. It is distributed as a patch rather than as full weights, so it must be applied to an original LLaMA-7B base checkpoint to obtain the usable model. The merged model performs well on a range of Chinese downstream tasks and improves on the original LLaMA across both Chinese and English benchmarks, including code generation.
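Patch-style releases like this typically ship weight deltas instead of full weights (the original LLaMA weights cannot be redistributed), so recovering the full model amounts to adding each delta tensor to the matching base tensor. The sketch below illustrates that merge step with toy numpy arrays standing in for real checkpoints; the function and parameter names are illustrative, not the repo's actual tooling.

```python
import numpy as np

def apply_patch(base_weights, patch_deltas):
    """Merge a delta-style patch into base model weights.

    Both arguments map parameter names to arrays; the patched
    model is simply base + delta for every shared parameter.
    (Illustrative sketch, not OpenLMLab's actual patch tool.)
    """
    if set(base_weights) != set(patch_deltas):
        raise ValueError("patch does not match base checkpoint")
    return {name: base_weights[name] + patch_deltas[name]
            for name in base_weights}

# Toy stand-ins for the LLaMA-7B base and the Chinese patch.
base = {
    "embed.weight": np.zeros((4, 2)),
    "lm_head.weight": np.ones((2, 4)),
}
delta = {
    "embed.weight": np.full((4, 2), 0.5),
    "lm_head.weight": -np.ones((2, 4)),
}

patched = apply_patch(base, delta)
```

In a real workflow the same elementwise addition would run over the tensors of the LLaMA-7B checkpoint and the downloaded patch files; consult the repository's own scripts for the supported procedure.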
