Eric111/MarcoHermes
Text generation · Model size: 7B · Quantization: FP8 · Context length: 4k · Published: Feb 9, 2024 · License: apache-2.0 · Architecture: Transformer · Open weights

MarcoHermes is a 7-billion-parameter language model published by Eric111, created by merging AtAndDev/CapybaraMarcoroni-7B and eren23/DistilHermes-2.5-Mistral-7B with the mergekit tool. The merge aims to combine the strengths of both base models in a single checkpoint, offering a 4096-token context window and targeting general-purpose text generation.
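The exact merge configuration was not published, so the following is a hypothetical sketch of what a mergekit config for these two Mistral-7B derivatives might look like. The merge method (SLERP), layer ranges, interpolation weight `t`, and dtype are all assumptions, not the author's actual settings:

```yaml
# Hypothetical mergekit config -- the real merge settings for MarcoHermes
# were not published; every parameter below is an assumption.
slices:
  - sources:
      - model: AtAndDev/CapybaraMarcoroni-7B
        layer_range: [0, 32]   # Mistral-7B has 32 transformer layers
      - model: eren23/DistilHermes-2.5-Mistral-7B
        layer_range: [0, 32]
merge_method: slerp            # assumed; a linear or TIES merge is equally plausible
base_model: AtAndDev/CapybaraMarcoroni-7B
parameters:
  t: 0.5                       # equal interpolation between the two models (assumption)
dtype: bfloat16
```

A config like this would be executed with mergekit's CLI, e.g. `mergekit-yaml config.yml ./merged-model`, which writes the merged weights to the output directory.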
