TinyLlama-1.1B-Chat-v0.3 is a 1.1 billion parameter Llama-architecture language model developed by the TinyLlama project and pretrained on 3 trillion tokens. This chat-finetuned version is designed for conversational applications, offering a compact model for environments with limited compute and memory. It adopts the same architecture and tokenizer as Llama 2, so it remains compatible with existing Llama-based open-source projects.
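Because the model shares the Llama 2 architecture and tokenizer, it can be loaded through the standard Hugging Face `transformers` auto classes. A minimal sketch, assuming the `transformers` package is installed (the helper function name is illustrative, not part of any official API):

```python
# Illustrative sketch: loading TinyLlama-1.1B-Chat-v0.3 via Hugging Face
# transformers. The model id below is the repository name on the Hub.
MODEL_ID = "TinyLlama/TinyLlama-1.1B-Chat-v0.3"

def load_chat_model(model_id: str = MODEL_ID):
    """Return (tokenizer, model) for the given Hub model id.

    Imports are kept inside the function so this file can be imported
    without downloading any weights.
    """
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id)
    return tokenizer, model
```

Calling `load_chat_model()` downloads the weights on first use; at roughly 1.1B parameters the model fits comfortably on consumer GPUs or CPU-only machines, which is the intended deployment niche.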