monuminu/indo-instruct-llama2-13b
Text generation · Concurrency cost: 1 · Model size: 13B · Quantization: FP8 · Context length: 4k · License: llama2 · Architecture: Transformer · Open weights · Cold

monuminu/indo-instruct-llama2-13b is a 13-billion-parameter instruction-tuned causal language model based on the LLaMA-2 architecture. Developed by monuminu and fine-tuned on the Alpaca dataset, it is designed for general English-language tasks. It is suitable for conversational and instruction-following applications and provides a solid base for further fine-tuning.
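As a back-of-the-envelope check of the card's stated size and quantization, FP8 implies roughly 1 byte per parameter, so the weights alone occupy about 13 GB. A minimal sketch (the serving-overhead factor for KV cache and activations is an assumption, not a figure from this card):

```python
# Rough VRAM estimate from the card's stated figures:
# 13B parameters at FP8 quantization (1 byte per parameter).
# The 1.2x overhead factor is an assumed allowance for KV cache
# and activations, not a number taken from the model card.

def estimated_vram_gb(num_params: float,
                      bytes_per_param: float,
                      overhead: float = 1.2) -> float:
    """Return an approximate VRAM requirement in gigabytes."""
    return num_params * bytes_per_param * overhead / 1e9

weights_gb = estimated_vram_gb(13e9, 1.0, overhead=1.0)  # weights alone
serving_gb = estimated_vram_gb(13e9, 1.0)                # with assumed overhead
print(f"weights: {weights_gb:.1f} GB, serving: ~{serving_gb:.1f} GB")
```

By the same arithmetic, an unquantized FP16 copy of the weights (2 bytes per parameter) would need about twice as much memory, which is why the FP8 quantization noted above matters for single-GPU deployment.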
