FuseAI/FuseChat-7B-VaRM
Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 4k · Published: Feb 26, 2024 · License: apache-2.0 · Architecture: Transformer · Open Weights

FuseAI/FuseChat-7B-VaRM is a 7-billion-parameter chat language model developed by Fanqi Wan, Ziyi Yang, Longguang Zhong, Xiaojun Quan, Xinting Huang, and Wei Bi of Sun Yat-sen University. It is built with FuseChat's knowledge-fusion and model-merging strategy (VaRM), which integrates the strengths of three diverse chat LLMs: Nous-Hermes-2-Mixtral-8x7B, Nous-Hermes-2-SOLAR-10.7B, and OpenChat-3.5-7B. The model achieves an MT-Bench score of 8.22, outperforming many 7B and 34B models and approaching larger models such as Mixtral-8x7B-Instruct, making it well suited to general conversational AI tasks.
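Below is a minimal sketch of querying the model through the standard Hugging Face transformers chat interface. The prompt, generation settings, and dtype/device choices are illustrative assumptions and are not taken from the model card.

    # Minimal sketch: load FuseChat-7B-VaRM and generate a reply with transformers.
    # Assumes the checkpoint ships a chat template; settings here are illustrative.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "FuseAI/FuseChat-7B-VaRM"

    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(
        model_id,
        device_map="auto",    # place layers on available GPU(s)/CPU
        torch_dtype="auto",   # use the dtype stored in the checkpoint
    )

    messages = [
        {"role": "user", "content": "Explain knowledge fusion for LLMs in two sentences."}
    ]
    inputs = tokenizer.apply_chat_template(
        messages, add_generation_prompt=True, return_tensors="pt"
    ).to(model.device)

    outputs = model.generate(inputs, max_new_tokens=256, do_sample=False)
    # Decode only the newly generated tokens, skipping the prompt.
    print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))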
