arcee-ai/Llama-3-8B-Instruct-Base-Slerp
Text Generation · Concurrency Cost: 1 · Model Size: 8B · Quant: FP8 · Ctx Length: 8k · Published: Apr 18, 2024 · License: apache-2.0 · Architecture: Transformer · Open Weights

arcee-ai/Llama-3-8B-Instruct-Base-Slerp is an 8-billion-parameter language model developed by arcee-ai, created by merging Meta-Llama-3-8B-Instruct and Meta-Llama-3-8B with a slerp (spherical linear interpolation) merge. The merge combines the instruction-following capabilities of the instruct variant with the foundational knowledge of the base model, producing a general-purpose model that draws on the strengths of both Llama 3 components.
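The exact merge configuration is not given here, but the core slerp operation is straightforward: for each pair of corresponding weight tensors, interpolate along the great circle between them rather than along a straight line, which better preserves the geometry of the weights. A minimal sketch of per-tensor slerp (function name and the linear-interpolation fallback threshold are illustrative, not taken from arcee-ai's pipeline):

```python
import numpy as np

def slerp(t, v0, v1, eps=1e-8):
    """Spherical linear interpolation between two weight tensors.

    t=0 returns v0 (e.g. the base model's tensor),
    t=1 returns v1 (e.g. the instruct model's tensor).
    """
    # Normalize copies to measure the angle between the tensors
    v0_n = v0 / (np.linalg.norm(v0) + eps)
    v1_n = v1 / (np.linalg.norm(v1) + eps)
    dot = np.clip(np.sum(v0_n * v1_n), -1.0, 1.0)

    # Nearly colinear tensors: fall back to plain linear interpolation
    if abs(dot) > 0.9995:
        return (1.0 - t) * v0 + t * v1

    theta = np.arccos(dot)          # angle between the two tensors
    sin_theta = np.sin(theta)
    s0 = np.sin((1.0 - t) * theta) / sin_theta
    s1 = np.sin(t * theta) / sin_theta
    return s0 * v0 + s1 * v1
```

In practice such merges are done with a tool like mergekit, which applies an interpolation factor per layer (often varying t across layers) rather than a single global value.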
