Ichsan2895/Merak-7B-v1
Task: Text Generation
Concurrency Cost: 1
Model Size: 7B
Quantization: FP8
Context Length: 4k
License: cc-by-nc-sa-4.0
Architecture: Transformer

Ichsan2895/Merak-7B-v1 is a 7-billion-parameter large language model developed by Ichsan2895 and fine-tuned from Meta's Llama-2-7B-Chat-HF. The model is optimized for Indonesian, having been fine-tuned on Indonesian Wikipedia articles. It supports a 4096-token context length and, using the 4-bit quantization scheme from QLoRA, can run on 16 GB of VRAM, making it suitable for Indonesian-language processing tasks.
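Since the card notes the model fits in 16 GB of VRAM via QLoRA-style 4-bit quantization, a minimal loading sketch with Hugging Face transformers and bitsandbytes might look like the following. The model ID matches this card; the example prompt is illustrative, and the upstream model card may define its own chat prompt template that should be preferred in practice.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

model_id = "Ichsan2895/Merak-7B-v1"

# 4-bit NF4 quantization (the scheme QLoRA is built on),
# which is what lets the 7B model fit in ~16 GB of VRAM.
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.float16,
)

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=bnb_config,
    device_map="auto",
)

# Illustrative Indonesian prompt: "Who was Indonesia's first president?"
prompt = "Siapa presiden pertama Indonesia?"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```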
