lgaalves/tinyllama-1.1b-chat-v0.3_platypus is a 1.1-billion-parameter instruction-tuned model developed by Luiz G A Alves, based on the TinyLlama transformer architecture. It is fine-tuned on STEM and logic-oriented instruction data, making it suitable for tasks that require reasoning in these domains. The model offers a 2048-token context length and performs competitively with its base model on benchmarks such as MMLU and TruthfulQA.
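A minimal usage sketch follows, assuming the model is hosted on the Hugging Face Hub under the identifier above and works with the standard transformers causal-LM API; the prompt format is illustrative, since chat-tuned models often expect a specific template:

```python
# Minimal sketch: load the model and run a short generation.
# Assumes the standard Hugging Face transformers text-generation workflow;
# the prompt below is a hypothetical example, not the model's required template.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "lgaalves/tinyllama-1.1b-chat-v0.3_platypus"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

prompt = "What is the derivative of x^2?"
inputs = tokenizer(prompt, return_tensors="pt")

# Generation must stay within the model's 2048-token context window.
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```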