TigerLLM-1B-it is a 1-billion-parameter instruction-tuned causal language model developed by Nishat Raihan and Marcos Zampieri at George Mason University. Part of the TigerLLM family, it is designed and optimized specifically for the Bangla language, leveraging a 10M-token educational corpus for pretraining and a 100K-instruction native dataset for instruction tuning. The model aims to address the linguistic disparity in LLM development for low-resource languages and establishes a new baseline for Bangla language modeling.
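As an instruction-tuned causal language model, TigerLLM-1B-it would typically be used through the Hugging Face `transformers` generation API. The sketch below shows a minimal, hedged usage pattern; the `MODEL_ID` repo path is a placeholder assumption, not the confirmed hub identifier, and the library is imported lazily inside the function so the sketch stays light.

```python
MODEL_ID = "TigerLLM/TigerLLM-1B-it"  # hypothetical hub id -- substitute the actual repo path


def generate(prompt: str, max_new_tokens: int = 128) -> str:
    """Generate a Bangla completion for `prompt` with a standard
    tokenize -> generate -> decode loop (a sketch, not the authors' code)."""
    # Lazy import so the module loads even without transformers installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID)

    inputs = tokenizer(prompt, return_tensors="pt")
    output_ids = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(output_ids[0], skip_special_tokens=True)
```

For an instruction-tuned checkpoint, prompts would normally be wrapped in the model's chat template (e.g. via `tokenizer.apply_chat_template`) rather than passed as raw text.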