Chamaka8/Serendip-LLM-CPT-SFT-v2
Text Generation
Concurrency Cost: 1
Model Size: 8B
Quant: FP8
Context Length: 8k
Published: Feb 18, 2026
License: apache-2.0
Architecture: Transformer
Open Weights

Chamaka8/Serendip-LLM-CPT-SFT-v2 is an 8.16 billion parameter instruction-following language model built on Meta Llama-3-8B and fine-tuned specifically for the Sinhala language. Trained on an expanded dataset of 309,328 Sinhala examples, it targets Sinhala news classification, question answering, and general text generation. The model offers specialized capabilities for Sinhala NLP tasks, with particular strength in news categorization, for which the training data includes 45,080 dedicated examples.
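Because the model is built on Meta Llama-3-8B, it presumably inherits the Llama-3 chat prompt template. Below is a minimal, dependency-free sketch of that format, assuming the fine-tune kept the base template; in practice you would load the tokenizer for Chamaka8/Serendip-LLM-CPT-SFT-v2 with Hugging Face transformers and call `tokenizer.apply_chat_template` instead. The Sinhala user message is purely illustrative.

```python
def format_llama3_prompt(messages):
    """Build a Llama-3 style chat prompt from a list of
    {"role": ..., "content": ...} messages.

    This hand-rolled formatter is a sketch of the standard Llama-3
    template; prefer tokenizer.apply_chat_template in real use, since
    the fine-tuned model's tokenizer config is authoritative."""
    parts = ["<|begin_of_text|>"]
    for msg in messages:
        parts.append(
            f"<|start_header_id|>{msg['role']}<|end_header_id|>\n\n"
            f"{msg['content']}<|eot_id|>"
        )
    # Open the assistant turn so the model generates the reply.
    parts.append("<|start_header_id|>assistant<|end_header_id|>\n\n")
    return "".join(parts)


# Illustrative request: "Classify this news item." in Sinhala.
prompt = format_llama3_prompt(
    [{"role": "user", "content": "මෙම පුවත වර්ගීකරණය කරන්න."}]
)
print(prompt)
```

The resulting string can be passed to any Llama-3-compatible inference endpoint as a raw completion prompt when a chat API is not available.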
