BanglaLLM/BanglaLLama-3.2-3b-bangla-alpaca-orca-instruct-v0.0.1
Task: Text generation · Model size: 3.2B · Quantization: BF16 · Context length: 32k · Published: Oct 14, 2024 · License: llama3.2 · Architecture: Transformer

BanglaLLM/BanglaLLama-3.2-3b-bangla-alpaca-orca-instruct-v0.0.1 is a 3.2-billion-parameter instruction-tuned causal language model developed by Abdullah Khan Zehady. Fine-tuned on the bangla-alpaca-orca dataset, it targets causal language modeling in both Bangla and English, extending LLM coverage to the Bangla language, and is ready for immediate inference.
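A minimal inference sketch with the Hugging Face `transformers` library is shown below. The model ID comes from this card; the Alpaca-style prompt template (`### Instruction:` / `### Input:` / `### Response:`) is an assumption inferred from the dataset name and may differ from the exact template used in training, and `max_new_tokens` is an illustrative choice.

```python
MODEL_ID = "BanglaLLM/BanglaLLama-3.2-3b-bangla-alpaca-orca-instruct-v0.0.1"


def build_prompt(instruction: str, context: str = "") -> str:
    """Assemble an Alpaca-style prompt (assumed template, not confirmed)."""
    if context:
        return (
            "### Instruction:\n" + instruction
            + "\n\n### Input:\n" + context
            + "\n\n### Response:\n"
        )
    return "### Instruction:\n" + instruction + "\n\n### Response:\n"


def generate(instruction: str, max_new_tokens: int = 256) -> str:
    """Load the model and generate a completion for one instruction."""
    # Imported here so the prompt helper above stays usable without
    # transformers/torch installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, torch_dtype="auto")
    inputs = tokenizer(build_prompt(instruction), return_tensors="pt")
    output = model.generate(**inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, skipping the prompt.
    new_tokens = output[0][inputs["input_ids"].shape[1]:]
    return tokenizer.decode(new_tokens, skip_special_tokens=True)


if __name__ == "__main__":
    # Bangla example: "What is the name of the capital of Bangladesh?"
    print(generate("বাংলাদেশের রাজধানীর নাম কী?"))
```

Since the model is published in BF16, loading with `torch_dtype="auto"` keeps the weights in their native precision rather than upcasting to FP32.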
