andrijdavid/tinyllama-dare
TEXT GENERATION · Open Weights
Model Size: 1.1B · Quant: BF16 · Ctx Length: 2k · Concurrency Cost: 1
Published: Jan 19, 2024 · License: apache-2.0 · Architecture: Transformer
andrijdavid/tinyllama-dare is a 1.1 billion parameter language model created by andrijdavid by merging five distinct TinyLlama-based models. It targets general language tasks, combining the strengths of its constituent models into a compact, versatile model with a 2048-token context length.
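The "dare" in the model name refers to the DARE (Drop And REscale) merging technique, in which each fine-tuned model's delta from the base weights is randomly sparsified and the surviving entries rescaled before the deltas are combined. Below is a minimal sketch of that idea on toy weight matrices; the function name, drop probability, and plain averaging are illustrative assumptions, not the actual recipe used to produce this model:

```python
import numpy as np

def dare_merge(base, finetuned_list, drop_prob=0.9, seed=0):
    """Toy DARE-style merge (illustrative, not this model's exact recipe).

    For each fine-tuned model: take its delta (task vector) from the base,
    drop each entry with probability `drop_prob`, rescale survivors by
    1/(1 - drop_prob) to preserve the delta's expected value, then average
    the sparsified deltas back onto the base weights.
    """
    rng = np.random.default_rng(seed)
    merged_delta = np.zeros_like(base)
    for ft in finetuned_list:
        delta = ft - base
        keep = rng.random(delta.shape) >= drop_prob  # keep with prob 1-p
        merged_delta += np.where(keep, delta / (1.0 - drop_prob), 0.0)
    return base + merged_delta / len(finetuned_list)

# Toy example: merge three "fine-tunes" of a small base matrix.
base = np.zeros((4, 4))
finetunes = [base + 0.1 * (i + 1) for i in range(3)]
merged = dare_merge(base, finetunes, drop_prob=0.5)
```

With `drop_prob=0` no entries are dropped and the merge reduces to a plain average of the fine-tuned weights; raising the drop probability keeps only a sparse subset of each delta while preserving its expected magnitude.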