dozzke/hermorca
Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Context Length: 4k · Published: Mar 21, 2024 · License: apache-2.0 · Architecture: Transformer · Open Weights

dozzke/hermorca is a 7-billion-parameter language model created by dozzke, merged from NousResearch/Hermes-2-Pro-Mistral-7B and Open-Orca/Mistral-7B-OpenOrca using the SLERP (spherical linear interpolation) method. The merge combines the strengths of its two base models, offering a versatile option for general language understanding and generation tasks. Its 4096-token context length makes it suitable for processing moderately sized inputs.
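As a rough illustration of what a SLERP merge does, the sketch below interpolates two weight tensors along the great-circle arc between them rather than along a straight line. This is a minimal NumPy sketch of the general technique, not the exact procedure or tooling used to produce this model; the tensor shapes and the interpolation factor `t=0.5` are illustrative assumptions.

```python
import numpy as np

def slerp(t, v0, v1, eps=1e-8):
    """Spherical linear interpolation between two weight tensors.

    t=0 returns v0, t=1 returns v1; intermediate t follows the arc
    between the two tensors (treated as flattened vectors).
    """
    v0f = v0.ravel().astype(np.float64)
    v1f = v1.ravel().astype(np.float64)
    # Cosine of the angle between the flattened tensors
    dot = np.dot(v0f, v1f) / (np.linalg.norm(v0f) * np.linalg.norm(v1f) + eps)
    dot = np.clip(dot, -1.0, 1.0)
    omega = np.arccos(dot)
    if abs(np.sin(omega)) < eps:
        # Nearly parallel tensors: fall back to plain linear interpolation
        return (1.0 - t) * v0 + t * v1
    s0 = np.sin((1.0 - t) * omega) / np.sin(omega)
    s1 = np.sin(t * omega) / np.sin(omega)
    return s0 * v0 + s1 * v1

# Toy example: merge two random "layer" tensors halfway
rng = np.random.default_rng(0)
a = rng.standard_normal((4, 4))
b = rng.standard_normal((4, 4))
merged = slerp(0.5, a, b)
print(merged.shape)
```

In a real merge this interpolation is applied tensor-by-tensor across the two checkpoints, often with a different `t` per layer group; SLERP is preferred over plain averaging because it preserves the norm geometry of the weights.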
