FuseAI/OpenChat-3.5-7B-Mixtral
Text generation · Concurrency cost: 1 · Model size: 7B · Quant: FP8 · Context length: 4k · Published: Feb 26, 2024 · License: apache-2.0 · Architecture: Transformer

FuseAI/OpenChat-3.5-7B-Mixtral is a 7 billion parameter chat language model developed by FuseAI, resulting from a pairwise knowledge fusion between OpenChat-3.5-7B and Nous-Hermes-2-Mixtral-8x7B-DPO. This model is a component of the broader FuseChat framework, designed to integrate the strengths of multiple LLMs into a single, memory-efficient model. It achieves strong performance on the MT-Bench benchmark, making it suitable for general conversational AI tasks.
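As an OpenChat-3.5 derivative, the model is typically prompted with OpenChat's "GPT4 Correct" conversation template. The sketch below builds such a prompt string; it assumes the fused model inherits OpenChat-3.5's template unchanged, which you should verify against the model's tokenizer configuration.

```python
def build_openchat_prompt(turns):
    """Format (role, text) turns with the OpenChat-3.5 template.

    Each turn becomes 'GPT4 Correct User: ...<|end_of_turn|>' (or the
    Assistant variant), and a trailing 'GPT4 Correct Assistant:' cues
    the model to generate the next reply.
    """
    parts = []
    for role, text in turns:
        speaker = "GPT4 Correct User" if role == "user" else "GPT4 Correct Assistant"
        parts.append(f"{speaker}: {text}<|end_of_turn|>")
    parts.append("GPT4 Correct Assistant:")
    return "".join(parts)


prompt = build_openchat_prompt([("user", "Summarize knowledge fusion in one sentence.")])
print(prompt)
```

The resulting string can be passed as the raw prompt to any completion endpoint serving the model; when using Hugging Face `transformers`, the tokenizer's built-in chat template (if present) should be preferred over hand-rolled formatting.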
