TinyLlama-1.1B-Chat-v0.1 is a 1.1-billion-parameter Llama-architecture chat model from the TinyLlama project, whose stated goal is to pretrain a compact model on 3 trillion tokens. It uses the same architecture and tokenizer as Llama 2, so it works with most tooling built for Llama models. Its small size and chat fine-tuning make it a good fit for deployments with tight compute and memory budgets.
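Because the model shares Llama 2's architecture and tokenizer, it can be loaded with the standard Hugging Face `transformers` API. The sketch below is a minimal, hedged example: the repo id `PY007/TinyLlama-1.1B-Chat-v0.1` and the `### Human:` / `### Assistant:` prompt format are assumptions that should be verified against the model card.

```python
def build_prompt(user_message: str) -> str:
    # Assumed instruction format for v0.1; check the model card,
    # since later TinyLlama chat releases use different templates.
    return f"### Human: {user_message}\n### Assistant:"


def generate(user_message: str, max_new_tokens: int = 128) -> str:
    # Lazy import so the prompt helper above works even without
    # transformers installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "PY007/TinyLlama-1.1B-Chat-v0.1"  # repo id assumed
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id)

    inputs = tokenizer(build_prompt(user_message), return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)
```

At 1.1B parameters the full-precision weights occupy roughly 4 GB, so the model fits on a single consumer GPU or even CPU-only hosts, which is the "restricted computation and memory" use case described above.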