MBZUAI/bactrian-x-llama-13b-merged
Text Generation · Model Size: 13B · Quant: FP8 · Context Length: 4K · Published: Jun 19, 2023 · License: MIT · Architecture: Transformer · Open Weights

MBZUAI/bactrian-x-llama-13b-merged is a 13-billion-parameter LLaMA-based model from MBZUAI, fine-tuned with low-rank adaptation (LoRA) and released with the adapter weights merged into the base model. It was trained on a multilingual instruction dataset built from Stanford-Alpaca-52k and databricks-dolly-15k, translated into 52 languages. The model specializes in multilingual instruction following, making it suitable for applications that require understanding and generating text across a broad range of languages.
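Instruction-tuned models of this family generally expect prompts in a fixed instruction template. As a minimal sketch, the helper below assumes an Alpaca-style `### Input:` / `### Output:` layout; the function name and the exact template are assumptions for illustration, so verify against the Bactrian-X training code before relying on them.

```python
# Hypothetical prompt builder assuming an Alpaca-style template.
# The exact template Bactrian-X was trained with may differ.
def build_prompt(instruction: str, context: str = "") -> str:
    """Format an instruction (and optional context) into a single prompt string."""
    body = f"{instruction}\n{context}" if context else instruction
    return f"### Input:\n{body}\n\n### Output:\n"

prompt = build_prompt("Translate to French: Good morning.")
print(prompt)
```

The resulting string would then be tokenized and passed to the model for generation; because the model was trained on translated instructions, the instruction itself can be written in any of the 52 covered languages.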
