Ichsan2895/Merak-7B-v2
Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 4k · Published: Aug 6, 2023 · License: cc-by-nc-sa-4.0 · Architecture: Transformer · Open Weights

Ichsan2895/Merak-7B-v2 is a 7-billion-parameter large language model developed by Muhammad Ichsan and fine-tuned from Meta's Llama-2-7B-Chat-HF. The model is optimized for the Indonesian language: it was fine-tuned with QLoRA on Indonesian Wikipedia articles, which keeps it efficient enough to run in approximately 16 GB of VRAM and makes it proficient at generating and understanding Indonesian text. A usage sketch follows below.
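As a rough illustration of how the model could be loaded within a modest VRAM budget, the sketch below uses the standard Hugging Face Transformers API with bitsandbytes 4-bit quantization (the same family of settings QLoRA fine-tuning relies on). The quantization parameters, prompt, and generation settings are assumptions for demonstration, not taken from this page.

```python
# Minimal sketch: load Merak-7B-v2 with 4-bit quantization to reduce VRAM use.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

model_id = "Ichsan2895/Merak-7B-v2"

# Assumed 4-bit NF4 quantization config; adjust to available hardware.
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.bfloat16,
)

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=bnb_config,
    device_map="auto",
)

# Example Indonesian prompt ("Explain what a large language model is.")
prompt = "Jelaskan apa itu model bahasa besar."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256, do_sample=True, temperature=0.7)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```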
