flemmingmiguel/HermesChat-Mistral-7B
Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 4k · Published: Jan 11, 2024 · License: apache-2.0 · Architecture: Transformer · Open Weights · Cold

HermesChat-Mistral-7B by flemmingmiguel is a 7-billion-parameter language model merged from openchat/openchat-3.5-1210 and teknium/OpenHermes-2.5-Mistral-7B, both built on the Mistral-7B-v0.1 architecture. It uses the slerp (spherical linear interpolation) merge method to combine the strengths of its constituent models, offering balanced performance for general conversational AI tasks. It is designed for applications requiring a capable 7B model with a 4096-token context length.
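To illustrate the merge method mentioned above, the sketch below shows what slerp does to a pair of weight tensors. This is a minimal NumPy illustration of the general technique, not the actual mergekit implementation used to produce this model; the function name and interpolation factor `t` are illustrative.

```python
import numpy as np

def slerp(t, v0, v1, eps=1e-8):
    """Spherical linear interpolation between two weight tensors.

    Interpolates along the arc between v0 and v1 rather than the
    straight line, which tends to preserve the magnitude and
    direction structure of the weights better than a plain linear
    average. Falls back to linear interpolation when the tensors
    are nearly colinear, where slerp is numerically unstable.
    """
    v0 = np.asarray(v0, dtype=np.float64)
    v1 = np.asarray(v1, dtype=np.float64)
    # Normalized copies are used only to measure the angle.
    u0 = v0 / (np.linalg.norm(v0) + eps)
    u1 = v1 / (np.linalg.norm(v1) + eps)
    dot = np.clip(np.sum(u0 * u1), -1.0, 1.0)
    if abs(dot) > 1.0 - eps:
        # Nearly parallel tensors: use ordinary lerp instead.
        return (1.0 - t) * v0 + t * v1
    omega = np.arccos(dot)      # angle between the two tensors
    sin_omega = np.sin(omega)
    return (np.sin((1.0 - t) * omega) / sin_omega) * v0 \
         + (np.sin(t * omega) / sin_omega) * v1
```

In a real merge this interpolation is applied layer by layer across the two parent checkpoints, with `t` (or a per-layer schedule of `t` values) controlling how much each parent contributes.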
