Ichsan2895/Merak-7B-v3
Text Generation · Model Size: 7B · Quant: FP8 · Ctx Length: 4K · License: cc-by-nc-sa-4.0 · Architecture: Transformer · Concurrency Cost: 1 · Open Weights

Merak-7B-v3 by Ichsan2895 is a 7-billion-parameter large language model fine-tuned specifically for Indonesian. Built on Meta's Llama-2-7B-Chat-HF, it was fine-tuned with QLoRA for efficiency and can run on a GPU with 16 GB of VRAM. The model is optimized for Indonesian language tasks, having been trained on cleaned Indonesian Wikipedia articles and additional Indonesian datasets.
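A minimal sketch of running the model locally with Hugging Face transformers, assuming a 4-bit quantized load via bitsandbytes so it fits within the stated 16 GB VRAM budget. The SYSTEM/USER/ASSISTANT prompt layout in `build_prompt` is an illustrative assumption, not the official template; consult the model card for the exact chat format.

```python
# Sketch: loading Ichsan2895/Merak-7B-v3 with transformers + bitsandbytes.
# The prompt layout below is an assumed single-turn format for illustration.


def build_prompt(user_message: str,
                 system_message: str = "Anda adalah asisten yang membantu.") -> str:
    """Format a single-turn chat prompt (assumed layout, not the official template)."""
    return (f"SYSTEM: {system_message}\n"
            f"USER: {user_message}\n"
            f"ASSISTANT: ")


if __name__ == "__main__":
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

    model_id = "Ichsan2895/Merak-7B-v3"
    # 4-bit quantization keeps the 7B weights within ~16 GB of VRAM.
    quant = BitsAndBytesConfig(load_in_4bit=True,
                               bnb_4bit_compute_dtype=torch.float16)
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id,
                                                quantization_config=quant,
                                                device_map="auto")

    prompt = build_prompt("Apa ibu kota Indonesia?")
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    output = model.generate(**inputs, max_new_tokens=128)
    print(tokenizer.decode(output[0], skip_special_tokens=True))
```

The heavy model load is kept under the `__main__` guard so the prompt helper can be reused or tested without downloading the weights.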
