arcee-ai/Gemma-Zephyr-Dolly-Chat-Slerp
Text Generation · Concurrency Cost: 1 · Model Size: 8.5B · Quant: FP8 · Context Length: 8k · Published: Mar 3, 2024 · License: apache-2.0 · Architecture: Transformer · Open Weights

Gemma-Zephyr-Dolly-Chat-Slerp is an 8.5 billion parameter language model created by arcee-ai. It was formed by merging HuggingFaceH4/zephyr-7b-gemma-v0.1 with google/gemma-7b+philschmid/gemma-7b-dolly-chatml using the slerp merge method. The model combines the strengths of its constituent Gemma-based models, offering blended chat and instruction-following capability for general language tasks. With an 8192-token context length, it is suitable for applications requiring moderate context understanding.
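The slerp (spherical linear interpolation) merge named above interpolates between two models' weight tensors along the arc of a hypersphere rather than along a straight line, which better preserves each parent's weight geometry. A minimal sketch of the underlying interpolation on NumPy arrays (the function name and the colinearity threshold are illustrative assumptions, not arcee-ai's actual merge code, which operates per-tensor across full checkpoints):

```python
import numpy as np

def slerp(t, v0, v1, eps=1e-8):
    """Spherical linear interpolation between tensors v0 and v1 at fraction t."""
    # Normalize copies to compute the angle between the two weight vectors
    v0_n = v0 / (np.linalg.norm(v0) + eps)
    v1_n = v1 / (np.linalg.norm(v1) + eps)
    dot = np.clip(np.sum(v0_n * v1_n), -1.0, 1.0)
    # Nearly colinear vectors: fall back to plain linear interpolation
    if abs(dot) > 0.9995:
        return (1 - t) * v0 + t * v1
    theta = np.arccos(dot)          # angle between the vectors
    sin_theta = np.sin(theta)
    w0 = np.sin((1 - t) * theta) / sin_theta
    w1 = np.sin(t * theta) / sin_theta
    return w0 * v0 + w1 * v1

# t=0 returns the first parent's weights, t=1 the second's;
# intermediate t blends them along the spherical arc.
a = np.array([1.0, 0.0])
b = np.array([0.0, 1.0])
mid = slerp(0.5, a, b)
```

In practice such a function is applied tensor-by-tensor across both checkpoints, with the interpolation fraction optionally varying by layer.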
