andrijdavid/tinyllama-dare is a 1.1-billion-parameter language model created by andrijdavid by merging five TinyLlama-based models. It is intended for general language tasks, combining the strengths of its constituent models into a compact, versatile model with a 2048-token context length.
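The "dare" in the model name suggests the merge was performed with the DARE (drop-and-rescale) method, which randomly drops a fraction of each model's delta parameters and rescales the rest before combining them. Merges like this are commonly described with a mergekit configuration; the sketch below is a hypothetical illustration only — the constituent model names, densities, and weights are placeholders, not the actual recipe used for this model:

```yaml
# Hypothetical mergekit config for a DARE-style merge of TinyLlama variants.
# Model names and parameter values are illustrative placeholders.
merge_method: dare_ties
base_model: TinyLlama/TinyLlama-1.1B-intermediate-step-1431k-3T
models:
  - model: TinyLlama/TinyLlama-1.1B-Chat-v1.0
    parameters:
      density: 0.5   # fraction of delta parameters kept after random drop
      weight: 0.5    # contribution of this model to the merge
  # ...additional TinyLlama-based models would be listed here...
dtype: bfloat16
```

Because all constituent models share the TinyLlama architecture, the merged model keeps the same 1.1B parameter count and 2048-token context window as its parents.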