jeff31415/TinyLlama-1.1B-1T-OpenOrca
Task: Text Generation
Concurrency Cost: 1
Model Size: 1.1B
Quant: BF16
Ctx Length: 2k
Published: Oct 9, 2023
License: apache-2.0
Architecture: Transformer
Open Weights

jeff31415/TinyLlama-1.1B-1T-OpenOrca is a 1.1 billion parameter language model fine-tuned from the TinyLlama-1.1B checkpoint trained on 1 trillion tokens (the "1T" in the name), using the GPT-4 subset of the OpenOrca dataset. The fine-tuning targets instruction following, drawing on OpenOrca's GPT-4-generated conversational data. Its small footprint makes it a practical option where memory or latency budgets rule out larger models but efficient natural language understanding and generation are still required.
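A minimal usage sketch with the Hugging Face transformers library is shown below, assuming the BF16 weights are loaded directly from the Hub. The prompt layout is a placeholder; consult the model card for the exact instruction format used during the OpenOrca fine-tune.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "jeff31415/TinyLlama-1.1B-1T-OpenOrca"

# Load tokenizer and model in bfloat16 to match the published BF16 weights.
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

# Hypothetical plain-text prompt; the fine-tune's actual instruction
# template may differ -- check the model card before relying on this.
prompt = "Explain what a context window is in one sentence."

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(
    **inputs,
    max_new_tokens=128,
    do_sample=True,
    temperature=0.7,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```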
