arcee-ai/gemma-7b-slerp
Text generation · Concurrency cost: 1 · Model size: 8.5B · Quant: FP8 · Context length: 8K · Published: Feb 27, 2024 · License: apache-2.0 · Architecture: Transformer · Open weights
arcee-ai/gemma-7b-slerp is an 8.5 billion parameter language model created by arcee-ai, formed by merging Google's Gemma 7B base and Gemma 7B-Instruct models with the SLERP (spherical linear interpolation) method. The merge aims to combine the instruction-following behavior of the tuned variant with the broader capabilities of the base model. It is suitable for general-purpose language tasks, including instruction following and text generation, with a context length of 8192 tokens.
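The SLERP merge described above interpolates between two models' parameters along the arc between them rather than along a straight line. A minimal sketch of that idea on a single weight tensor, using NumPy (this is an illustration of the general technique, not arcee-ai's exact merging code):

```python
import numpy as np

def slerp(t, v0, v1, eps=1e-8):
    """Spherical linear interpolation between two weight tensors.

    t is the mixing factor in [0, 1]: 0 returns v0, 1 returns v1.
    Falls back to plain linear interpolation when the vectors are
    nearly parallel, where the spherical formula is ill-conditioned.
    """
    a = v0.ravel().astype(np.float64)
    b = v1.ravel().astype(np.float64)
    na, nb = np.linalg.norm(a), np.linalg.norm(b)
    # Angle between the two flattened parameter vectors
    cos_omega = np.clip(np.dot(a / (na + eps), b / (nb + eps)), -1.0, 1.0)
    omega = np.arccos(cos_omega)
    if omega < eps:  # nearly parallel: linear interpolation is fine
        return (1 - t) * v0 + t * v1
    s = np.sin(omega)
    out = (np.sin((1 - t) * omega) / s) * a + (np.sin(t * omega) / s) * b
    return out.reshape(v0.shape)
```

In practice a merge tool applies this per tensor across both checkpoints, often with a different `t` per layer; the sketch shows only the core interpolation step.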