mesolitica/llama-7b-hf-32768-fpf
Text generation · Concurrency cost: 1 · Model size: 7B · Quant: FP8 · Ctx length: 4k · Architecture: Transformer

mesolitica/llama-7b-hf-32768-fpf is a 7-billion-parameter Llama 2 model developed by mesolitica and full-parameter fine-tuned on Malaysian text. Its context length is extended to 32,768 tokens, making it suitable for processing long documents and conversations in the Malaysian language, and for applications that require deep understanding and generation of Malaysian-specific content.
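Since the checkpoint is published in the standard Hugging Face format, a minimal sketch of loading and prompting it with the `transformers` library might look like the following (this is an illustrative example, not usage documented by mesolitica; the prompt, dtype, and generation settings are assumptions, and a GPU with sufficient memory is needed for a 7B model):

```python
# Sketch: load mesolitica/llama-7b-hf-32768-fpf with Hugging Face transformers.
# Assumes `transformers` and `torch` are installed; settings are illustrative.
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "mesolitica/llama-7b-hf-32768-fpf"

def main():
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID,
        torch_dtype="auto",   # pick the checkpoint's native dtype
        device_map="auto",    # place layers on available devices
    )
    # Hypothetical Malay prompt to exercise the fine-tuned model.
    prompt = "Terangkan secara ringkas sejarah Kuala Lumpur."
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    output = model.generate(**inputs, max_new_tokens=256)
    print(tokenizer.decode(output[0], skip_special_tokens=True))

if __name__ == "__main__":
    main()
```

The 32,768-token window means long inputs generally do not need chunking, but memory use for the KV cache grows with sequence length, so very long prompts may still require a reduced batch size or quantized weights.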
