sapinsapin/llama31-8b-balitanlp-cpt
Text generation · Concurrency cost: 1 · Model size: 8B · Quant: FP8 · Context length: 32k · License: llama3.1 · Architecture: Transformer

sapinsapin/llama31-8b-balitanlp-cpt is an 8-billion-parameter Llama-3.1 base model continually pretrained by sapinsapin on Filipino news articles from the BalitaNLP dataset, improving its understanding and generation of the Filipino language. It is intended as a foundation for further instruction tuning or for applications requiring Filipino language proficiency.
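As a minimal sketch, the checkpoint can be loaded with the Hugging Face transformers library, assuming it is hosted on the Hub under this ID and that you have accepted the llama3.1 license. The prompt in the usage note is illustrative only:

```python
# Minimal sketch of loading the model with Hugging Face transformers.
# Assumes the checkpoint is on the Hub under this ID and that the
# llama3.1 license has been accepted for your account.
MODEL_ID = "sapinsapin/llama31-8b-balitanlp-cpt"

def generate(prompt: str, max_new_tokens: int = 64) -> str:
    """Continue a Filipino prompt with the base model.

    No chat template is applied: this is a continually pretrained
    base model, not an instruction-tuned one, so it completes text
    rather than following instructions.
    """
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto")
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    output = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(output[0], skip_special_tokens=True)
```

For example, calling `generate("Ang balita ngayong araw ay")` would return that Filipino prompt continued by the model.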
