jeff31415/TinyLlama-1.1B-1.5T-OpenOrca-Alpha
Task: Text Generation · Concurrency Cost: 1 · Model Size: 1.1B · Quant: BF16 · Ctx Length: 2k · Published: Oct 24, 2023 · License: apache-2.0 · Architecture: Transformer · Open Weights
jeff31415/TinyLlama-1.1B-1.5T-OpenOrca-Alpha is a 1.1 billion parameter language model fine-tuned by jeff31415 on the TinyLlama architecture. It was trained for one epoch on the GPT-4 subset of the OpenOrca dataset, making it suited to instruction-following tasks. The model builds on an early TinyLlama checkpoint trained to 1.5 trillion tokens and remains usable despite a known dataset-processing bug in that base model.
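As an instruction-tuned model, it expects prompts wrapped in an instruction template. The exact template used during fine-tuning is not stated on this card, so the sketch below assumes a generic Alpaca-style format; the helper name `build_prompt` is illustrative, not part of any published API.

```python
# Hypothetical prompt builder for an instruction-tuned model.
# The Alpaca-style "### Instruction / ### Response" template is an
# assumption; check the fine-tuning repo for the actual format.
def build_prompt(instruction: str) -> str:
    return (
        "### Instruction:\n"
        f"{instruction.strip()}\n\n"
        "### Response:\n"
    )

prompt = build_prompt("Explain what a context length of 2k tokens means.")
print(prompt)
```

The resulting string would typically be passed to a text-generation pipeline (for example, Hugging Face `transformers` with `model="jeff31415/TinyLlama-1.1B-1.5T-OpenOrca-Alpha"` in BF16), with generation stopped at the model's end-of-sequence token.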