MBZUAI/bactrian-x-llama-7b-merged
Text generation · Concurrency cost: 1 · Model size: 7B · Quantization: FP8 · Context length: 4k · License: MIT · Architecture: Transformer · Open weights

MBZUAI/bactrian-x-llama-7b-merged is a 7-billion-parameter LLaMA-based model fine-tuned with low-rank adaptation (LoRA) on the Bactrian-X multilingual instruction dataset, which combines instructions from Stanford Alpaca (52k) and Databricks Dolly (15k) translated into 52 languages. As the "-merged" suffix indicates, the LoRA weights have been merged back into the base model, so it loads as a standard checkpoint. The model specializes in multilingual instruction following, making it suitable for applications that require understanding and generating text across diverse languages.
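
Because the adapter weights are already merged, the checkpoint can be loaded directly with Hugging Face transformers. The sketch below is a minimal example, not an official recipe; the Alpaca-style `### Input:` / `### Output:` prompt template is an assumption based on the model's instruction-tuning lineage, so consult the Bactrian-X repository for the exact format.

```python
# Minimal sketch: load the merged checkpoint and run one multilingual
# instruction. Prompt template is assumed, not confirmed by this page.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "MBZUAI/bactrian-x-llama-7b-merged"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # half precision fits on a single ~16 GB GPU
    device_map="auto",
)

# Assumed Alpaca-style template; the instruction may be written in any of
# the dataset's 52 languages.
prompt = (
    "### Input:\n"
    "Translate the following sentence into German: "
    "\"The weather is nice today.\"\n\n"
    "### Output:\n"
)

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128, do_sample=False)

# Decode only the newly generated tokens, skipping the echoed prompt.
new_tokens = outputs[0][inputs["input_ids"].shape[1]:]
print(tokenizer.decode(new_tokens, skip_special_tokens=True))
```

Greedy decoding (`do_sample=False`) is used here for reproducibility; sampling parameters such as `temperature` and `top_p` can be passed to `generate` for more varied outputs.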
