THU-KEG/ADELIE-SFT-1.5B
Text Generation · Concurrency Cost: 1 · Model Size: 1.5B · Quant: BF16 · Ctx Length: 32k · Published: Nov 4, 2024 · License: llama2 · Architecture: Transformer · Open Weights · Warm

THU-KEG/ADELIE-SFT-1.5B is a 1.5 billion parameter instruction-tuned language model developed by Yunjia Qi, Hao Peng, Xiaozhi Wang, Bin Xu, Lei Hou, and Juanzi Li, fine-tuned from Qwen2.5-1.5B. It is aligned specifically for Information Extraction (IE) tasks, including closed, open, and on-demand IE, using the high-quality IEInstruct alignment corpus. The model achieves state-of-the-art performance among open-source models on IE benchmarks while retaining general language capabilities, and supports a 32,768-token context length.
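A minimal sketch of querying the model for a closed-IE (named entity recognition) task via Hugging Face `transformers`. The prompt wording in `build_ie_prompt` is illustrative only, not the exact template used during IEInstruct training, and the example text and entity types are hypothetical:

```python
def build_ie_prompt(text: str, entity_types: list[str]) -> str:
    """Build a simple closed-IE instruction (hypothetical prompt format)."""
    types = ", ".join(entity_types)
    return (
        f"Extract all entities of the following types from the text: {types}.\n"
        f"Text: {text}\n"
        "Entities:"
    )

if __name__ == "__main__":
    # Requires `pip install transformers torch` and network access to
    # download the BF16 checkpoint.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "THU-KEG/ADELIE-SFT-1.5B"
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype="bfloat16")

    prompt = build_ie_prompt(
        "Barack Obama was born in Honolulu.", ["person", "location"]
    )
    inputs = tokenizer(prompt, return_tensors="pt")
    output = model.generate(**inputs, max_new_tokens=64)
    # Decode only the newly generated tokens, not the echoed prompt.
    print(tokenizer.decode(
        output[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True
    ))
```

The 32k context window leaves ample room for long input documents in the `Text:` field before the instruction must be truncated.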
