mesolitica/malaysian-llama2-13b-32k-instructions
TEXT GENERATION
Concurrency Cost: 1 · Model Size: 13B · Quant: FP8 · Ctx Length: 4k · Architecture: Transformer · Cold
The mesolitica/malaysian-llama2-13b-32k-instructions model is a 13-billion-parameter, Llama2-based, instruction-tuned language model developed by mesolitica. It was fine-tuned with QLoRA on a translated UltraChat dataset and is designed specifically for chat completion in Malay. The model supports a 32k context length and uses the standard Llama2 chat template, making it suitable for conversational AI applications that require Malay language understanding and generation.