mesolitica/malaysian-mistral-7b-32k-instructions
Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 4k · Published: Nov 3, 2023 · Architecture: Transformer

The mesolitica/malaysian-mistral-7b-32k-instructions model is a 7-billion-parameter Mistral-based language model developed by Mesolitica and fine-tuned for instruction following in the Malaysian context. It supports a 32k context length and is optimized for understanding and generating responses grounded in Malaysian language and culture, with function calling capabilities.
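Since the model is a Mistral fine-tune, prompts are typically wrapped in Mistral's `[INST]` chat template before being sent to a text-generation endpoint. The sketch below shows one way to build such a prompt; the exact template this fine-tune expects is an assumption here, so verify it against the model card before use.

```python
# Minimal sketch, assuming the standard Mistral instruction template.
# build_prompt is a hypothetical helper, not part of any library.

def build_prompt(instruction: str) -> str:
    """Wrap a user instruction in a Mistral-style [INST] template."""
    return f"<s>[INST] {instruction} [/INST]"

prompt = build_prompt("Terangkan maksud 'gotong-royong'.")
print(prompt)
# The resulting string can be passed as the prompt to the
# text-generation endpoint hosting this model.
```

The wrapped string would then be submitted as the prompt field of a completion request; the model generates its reply after the closing `[/INST]` marker.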
