Weyaxi/OpenOrca-Zephyr-7B
Text Generation
Concurrency Cost: 1
Model Size: 7B
Quant: FP8
Ctx Length: 4k
Published: Oct 11, 2023
License: cc-by-nc-4.0
Architecture: Transformer
Weyaxi/OpenOrca-Zephyr-7B is a 7-billion-parameter language model created by Weyaxi by merging HuggingFaceH4/zephyr-7b-alpha and Open-Orca/Mistral-7B-OpenOrca with the TIES merge method. The merge combines the strengths of both base models, giving a balanced performance profile for general language understanding and generation tasks. It is suitable for applications that need a capable 7B model with a 4096-token context length.
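As a sketch of how the model might be prompted, the helper below builds a Zephyr-style chat prompt (the `<|system|>` / `<|user|>` / `<|assistant|>` template used by zephyr-7b-alpha, one of the merge's parents). The exact template this merged model expects is an assumption here; check the model card, or use the tokenizer's own `apply_chat_template` if one is defined.

```python
# Sketch: format a prompt in the Zephyr chat style (assumed template for
# this merge, inherited from HuggingFaceH4/zephyr-7b-alpha).
def format_zephyr_prompt(system: str, user: str) -> str:
    """Build a single-turn prompt string in the Zephyr chat format."""
    return (
        f"<|system|>\n{system}</s>\n"
        f"<|user|>\n{user}</s>\n"
        f"<|assistant|>\n"
    )

prompt = format_zephyr_prompt(
    "You are a helpful assistant.",
    "Summarize what a TIES merge does in one sentence.",
)
# The prompt string can then be passed to a transformers text-generation
# pipeline loaded with "Weyaxi/OpenOrca-Zephyr-7B", e.g.:
#   from transformers import pipeline
#   pipe = pipeline("text-generation", model="Weyaxi/OpenOrca-Zephyr-7B")
#   out = pipe(prompt, max_new_tokens=128)
```

Keeping the full prompt within the 4096-token context window is the caller's responsibility; long system prompts eat into the budget available for generation.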