The johnsutor/mixture-of-llamas-ties model is an 8-billion-parameter instruction-tuned language model created by johnsutor on top of the Meta-Llama-3-8B-Instruct base. It was produced with the TIES merge method (Trim, Elect Sign & Merge), which combines several specialized Llama-3-8B-Instruct variants into a single checkpoint intended to retain the strengths of each constituent model. The result is a versatile instruction-following model with an 8192-token context length.
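Since the merged model follows the standard Llama-3-Instruct conventions, it should load through the Hugging Face `transformers` stack like any other causal LM. Below is a minimal inference sketch: the model ID comes from this card, while the dtype, device placement, and example prompt are illustrative assumptions, not documented settings.

```python
# Minimal usage sketch, assuming the model is hosted on the Hugging Face Hub
# under the ID shown in this card; requires `transformers` and `torch`.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "johnsutor/mixture-of-llamas-ties"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # assumed dtype; Llama-3-8B weights are commonly bf16
    device_map="auto",           # spread weights across available devices
)

# Llama-3-Instruct models ship a chat template; apply it rather than
# hand-formatting the prompt.
messages = [
    {"role": "user", "content": "Summarize the TIES merge method in one sentence."}
]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(input_ids, max_new_tokens=128)
# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(outputs[0][input_ids.shape[-1]:], skip_special_tokens=True))
```

TIES-style merges like this one are typically built with tooling such as mergekit's `ties` merge method, which trims low-magnitude task-vector deltas and resolves sign conflicts before averaging; the exact recipe used for this checkpoint is not stated in the card.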