arcee-ai/Llama-3-Base-Instruct-Slerp
Text Generation · Model Size: 8B · Quant: FP8 · Context Length: 8k · Published: Apr 18, 2024 · License: apache-2.0 · Architecture: Transformer · Concurrency Cost: 1 · Open Weights

arcee-ai/Llama-3-Base-Instruct-Slerp is an 8-billion-parameter language model from Arcee AI, created by merging Meta-Llama-3-8B (the base model) and Meta-Llama-3-8B-Instruct using spherical linear interpolation (SLERP). The merge combines the base Llama 3 model's capabilities with the instruction-following behavior of the fine-tuned variant, offering balanced performance for general conversational AI tasks. It supports a context length of 8192 tokens, making it suitable for applications requiring moderate-length context understanding and generation.
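To illustrate the merging technique named above, here is a minimal sketch of SLERP applied to a pair of weight tensors. This is an assumption-laden illustration of the general formula, not Arcee AI's actual merge pipeline (which was presumably run per-tensor across all layers of the two checkpoints with a merging toolkit):

```python
import numpy as np

def slerp(t, v0, v1, eps=1e-8):
    """Spherical linear interpolation between two weight tensors.

    Illustrative sketch only: treats each tensor as a single flat
    vector; a real model merge applies this per-tensor (often with
    per-layer interpolation factors) across two full checkpoints.
    """
    v0f = v0.ravel().astype(np.float64)
    v1f = v1.ravel().astype(np.float64)
    # Angle between the two tensors, from the normalized dot product.
    dot = np.dot(v0f / np.linalg.norm(v0f), v1f / np.linalg.norm(v1f))
    dot = np.clip(dot, -1.0, 1.0)
    theta = np.arccos(dot)
    if np.sin(theta) < eps:
        # Nearly colinear tensors: fall back to plain linear interpolation.
        merged = (1.0 - t) * v0f + t * v1f
    else:
        # Standard SLERP weights: interpolate along the great arc.
        s0 = np.sin((1.0 - t) * theta) / np.sin(theta)
        s1 = np.sin(t * theta) / np.sin(theta)
        merged = s0 * v0f + s1 * v1f
    return merged.reshape(v0.shape)
```

At t=0 the result equals the first tensor and at t=1 the second; intermediate values of t trace the arc between them rather than a straight line, which tends to preserve weight norms better than plain averaging.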
