cris177/Orca-Hermes-7B-slerp
Text generation · Concurrency cost: 1 · Model size: 7B · Quant: FP8 · Context length: 4k · Published: Jan 23, 2024 · License: apache-2.0 · Architecture: Transformer · Open weights

cris177/Orca-Hermes-7B-slerp is a 7-billion-parameter language model created by cris177, produced by merging Open-Orca/Mistral-7B-OpenOrca and teknium/OpenHermes-2.5-Mistral-7B with spherical linear interpolation (slerp). The merge aims to combine the strengths of both base models into a balanced, general-purpose model. With a 4096-token context length, it suits applications with moderate input and output lengths.
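For intuition, slerp merging interpolates each pair of corresponding weight tensors along the great circle between them rather than along a straight line, which preserves the magnitude of the weights better than plain averaging. A minimal sketch of the core operation (a generic slerp on flattened weight vectors, not the exact merge tooling used for this model):

```python
import numpy as np

def slerp(t, v0, v1, eps=1e-8):
    """Spherical linear interpolation between two weight vectors.

    t: interpolation factor in [0, 1]; t=0 returns v0, t=1 returns v1.
    """
    v0_n = v0 / (np.linalg.norm(v0) + eps)
    v1_n = v1 / (np.linalg.norm(v1) + eps)
    dot = np.clip(np.dot(v0_n, v1_n), -1.0, 1.0)
    # For near-parallel vectors, fall back to linear interpolation
    # to avoid division by a vanishing sin(theta).
    if abs(dot) > 0.9995:
        return (1 - t) * v0 + t * v1
    theta = np.arccos(dot)              # angle between the two vectors
    sin_theta = np.sin(theta)
    s0 = np.sin((1 - t) * theta) / sin_theta
    s1 = np.sin(t * theta) / sin_theta
    return s0 * v0 + s1 * v1
```

In a full merge, a function like this would be applied tensor-by-tensor across both checkpoints, often with different `t` values per layer group; those per-layer schedules are configuration details not stated on this page.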
