ICTNLP/bayling-13b-diff
Text generation · Concurrency cost: 1 · Model size: 13B · Quantization: FP8 · Context length: 4K · Published: Jun 14, 2023 · License: GPL-3.0 · Architecture: Transformer · Open weights

ICTNLP/bayling-13b-diff is a 13-billion-parameter instruction-following large language model developed by the NLP Group at the Institute of Computing Technology, Chinese Academy of Sciences (ICT/CAS). It is the weight-diff release of BayLing-13B-v1.0: the published weights are deltas that must be merged with the corresponding base model weights to recover the full model. BayLing targets cross-lingual alignment and offers strong English/Chinese generation, instruction following, and multi-turn interaction. The 13B model can run on a consumer-grade GPU with 16 GB of memory, supporting tasks such as translation, writing, and creative generation.
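The "diff" in the name means the released checkpoint stores weight deltas rather than full weights; recovering the usable model amounts to element-wise addition against the base model's parameters. A minimal sketch of that idea, using plain Python floats in place of real tensor state dicts (the function and key names here are illustrative assumptions, not BayLing's actual tooling):

```python
# Sketch: reconstructing full weights from a base checkpoint plus a
# weight-diff ("delta") checkpoint. Real checkpoints hold tensors;
# scalar floats stand in for them here.

def apply_delta(base, delta):
    """Return full weights where full[k] = base[k] + delta[k]."""
    if base.keys() != delta.keys():
        raise ValueError("base and delta checkpoints do not match")
    return {key: base[key] + delta[key] for key in base}

# Hypothetical two-parameter "model" for illustration.
base_weights = {"layer0.w": 0.5, "layer0.b": -0.5}
diff_weights = {"layer0.w": 0.25, "layer0.b": 0.5}

full_weights = apply_delta(base_weights, diff_weights)
print(full_weights)  # {'layer0.w': 0.75, 'layer0.b': 0.0}
```

Distributing deltas instead of full weights is a common pattern for models derived from a base model with a restrictive license: the recipient supplies the base weights themselves and applies the diff locally.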
