ayousanz/llama-ca-7B-slerp
Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 4k · Published: Jan 12, 2024 · License: llama2 · Architecture: Transformer · Open Weights · Cold

ayousanz/llama-ca-7B-slerp is a 7-billion-parameter language model created by ayousanz, produced by a slerp (spherical linear interpolation) merge of Meta's Llama-2-7b-chat-hf and CyberAgent's calm2-7b. The merge aims to combine the strengths of both base models: the robust general conversational ability of Llama 2 Chat and the characteristics of calm2-7b, including its Japanese language training. This makes the model suitable for applications that need a blend of broad general knowledge and Japanese language understanding.
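A slerp merge interpolates each pair of corresponding parameter tensors along the arc between them rather than along a straight line, which tends to preserve the geometry of the weights better than plain averaging. As a rough illustration of the idea (a minimal NumPy sketch of the interpolation formula, not the actual merge tooling used for this model):

```python
import numpy as np

def slerp(t, v0, v1, eps=1e-8):
    """Spherical linear interpolation between two flattened weight tensors.

    t=0 returns v0, t=1 returns v1; intermediate t moves along the arc.
    """
    # Normalize copies to measure the angle between the two directions
    u0 = v0 / (np.linalg.norm(v0) + eps)
    u1 = v1 / (np.linalg.norm(v1) + eps)
    dot = np.clip(np.dot(u0, u1), -1.0, 1.0)
    theta = np.arccos(dot)  # angle between the two weight directions
    if theta < eps:
        # Nearly parallel vectors: fall back to ordinary linear interpolation
        return (1 - t) * v0 + t * v1
    s = np.sin(theta)
    return (np.sin((1 - t) * theta) / s) * v0 + (np.sin(t * theta) / s) * v1

# Toy example: halfway along the arc between two orthogonal unit vectors
a = np.array([1.0, 0.0])
b = np.array([0.0, 1.0])
mid = slerp(0.5, a, b)  # stays on the unit circle, unlike plain averaging
```

In a real merge this interpolation is applied tensor-by-tensor across the two checkpoints, often with different blend weights for different layer groups.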
