BanglaLLM/BanglaLLama-3-8b-bangla-alpaca-orca-instruct-v0.0.1
Text generation · Model size: 8B · Quantization: FP8 · Context length: 8K · License: llama3 · Architecture: Transformer · Concurrency cost: 1

BanglaLLM/BanglaLLama-3-8b-bangla-alpaca-orca-instruct-v0.0.1 is an 8-billion-parameter instruction-tuned causal language model from BanglaLLM. It was pre-trained on the uonlp/CulturaX dataset and then fine-tuned on BanglaLLM/bangla-alpaca-orca, with the goal of advancing LLMs for the Bangla language. The model supports both Bangla and English, making it suitable for immediate inference in bilingual applications.
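A minimal inference sketch with the Hugging Face `transformers` library is shown below. The Alpaca-style prompt template is an assumption based on the bangla-alpaca-orca fine-tuning data; check the model card for the exact format the model expects before relying on it.

```python
MODEL_ID = "BanglaLLM/BanglaLLama-3-8b-bangla-alpaca-orca-instruct-v0.0.1"

def build_prompt(instruction: str, user_input: str = "") -> str:
    """Build an Alpaca-style prompt (assumed template; verify against the model card)."""
    if user_input:
        return (
            "Below is an instruction that describes a task, paired with an input "
            "that provides further context. Write a response that appropriately "
            "completes the request.\n\n"
            f"### Instruction:\n{instruction}\n\n"
            f"### Input:\n{user_input}\n\n"
            "### Response:\n"
        )
    return (
        "Below is an instruction that describes a task. Write a response that "
        "appropriately completes the request.\n\n"
        f"### Instruction:\n{instruction}\n\n"
        "### Response:\n"
    )

def main() -> None:
    # Heavy: downloads ~8B parameters of weights; needs `pip install transformers torch`
    # and a GPU with enough memory (or CPU with patience).
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID, torch_dtype=torch.bfloat16, device_map="auto"
    )

    # Bilingual example: a Bangla input with an English instruction.
    prompt = build_prompt("Translate the input to English.", "আমি বাংলায় কথা বলি।")
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    output = model.generate(**inputs, max_new_tokens=128, do_sample=False)
    # Decode only the newly generated tokens, skipping the prompt.
    print(tokenizer.decode(
        output[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True
    ))

if __name__ == "__main__":
    main()
```

The model loading is kept inside `main()` so the prompt helper can be reused without pulling in the weights.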
