FredrikBL/HermesFlashback-7B.1
Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Context Length: 4k · Published: Mar 25, 2024 · License: apache-2.0 · Architecture: Transformer · Open Weights

HermesFlashback-7B.1 is a 7-billion-parameter language model created by FredrikBL, formed by merging mlabonne/NeuralHermes-2.5-Mistral-7B and timpal0l/Mistral-7B-v0.1-flashback-v2. Built on the Mistral architecture, it is designed for general text generation and aims to combine the strengths of its constituent models. With a context length of 4096 tokens, it is suited to a range of conversational and content-creation applications.
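As a sketch of how the model might be prompted: one of its parents, NeuralHermes-2.5-Mistral-7B, uses the ChatML chat format, so the merge plausibly expects the same format — this is an assumption, not documented behavior, and should be checked against the model's tokenizer configuration. A minimal prompt-assembly helper under that assumption:

```python
# Sketch: build a ChatML-style prompt for HermesFlashback-7B.1.
# Assumption: the merge inherits NeuralHermes-2.5's ChatML format;
# verify against the model's tokenizer_config before relying on this.

def build_chatml_prompt(system: str, user: str) -> str:
    """Assemble a ChatML prompt with system and user turns,
    leaving the assistant turn open for generation."""
    return (
        f"<|im_start|>system\n{system}<|im_end|>\n"
        f"<|im_start|>user\n{user}<|im_end|>\n"
        "<|im_start|>assistant\n"
    )

prompt = build_chatml_prompt(
    "You are a helpful assistant.",
    "Summarize the Mistral architecture in one sentence.",
)
print(prompt)

# With the transformers library, generation might then look like:
#   from transformers import pipeline
#   pipe = pipeline("text-generation", model="FredrikBL/HermesFlashback-7B.1")
#   out = pipe(prompt, max_new_tokens=128)
```

Keeping the final `<|im_start|>assistant\n` turn open lets the model continue the conversation from there, which is the usual ChatML convention.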
