FuseAI/FuseChat-7B-v2.0
Text generation · Concurrency cost: 1 · Model size: 7B · Quantization: FP8 · Context length: 4k · Published: Aug 13, 2024 · License: apache-2.0 · Architecture: Transformer

FuseAI/FuseChat-7B-v2.0 is a 7 billion parameter chat language model developed by Fanqi Wan, Longguang Zhong, Ziyi Yang, Ruijun Chen, and Xiaojun Quan of Sun Yat-sen University. It fuses the collective knowledge of six diverse chat LLMs (OpenChat-3.5-7B, Starling-LM-7B-alpha, NH2-Solar-10.7B, InternLM2-Chat-20B, Mixtral-8x7B-Instruct, and Qwen1.5-Chat-72B) into a single, memory-efficient model. On MT-Bench it achieves an average score of 7.38, comparable to Mixtral-8x7B-Instruct and approaching GPT-3.5-Turbo-1106, making it well suited to instruction following and multi-turn conversation.
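Since the model targets multi-turn chat, prompts must follow its conversation template. The sketch below assumes the OpenChat-3.5-style template (a plausible assumption given that OpenChat-3.5-7B is among the source models, but not confirmed here); in practice, prefer the template shipped with the model's tokenizer, e.g. `tokenizer.apply_chat_template` in `transformers`.

```python
def build_prompt(messages):
    """Format a multi-turn conversation into a single prompt string.

    Assumes the OpenChat-3.5-style template ("GPT4 Correct User" /
    "GPT4 Correct Assistant" turns separated by <|end_of_turn|>).
    Verify against the model's own chat template before relying on this.
    """
    parts = []
    for msg in messages:
        role = "GPT4 Correct User" if msg["role"] == "user" else "GPT4 Correct Assistant"
        parts.append(f"{role}: {msg['content']}<|end_of_turn|>")
    # Trailing assistant header cues the model to generate its reply.
    parts.append("GPT4 Correct Assistant:")
    return "".join(parts)

prompt = build_prompt([
    {"role": "user", "content": "Hello"},
    {"role": "assistant", "content": "Hi! How can I help?"},
    {"role": "user", "content": "Summarize model fusion in one sentence."},
])
print(prompt)
```

This only builds the prompt text; generation itself would go through the model's serving API or a local `transformers` pipeline.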
