Suchinthana/databricks-dolly-15k-sinhala
Text Generation · Concurrency cost: 1 · Model size: 7B · Quant: FP8 · Context length: 4k · License: llama2 · Architecture: Transformer · Open weights · Cold

Suchinthana/databricks-dolly-15k-sinhala is a 7-billion-parameter Llama 2 model fine-tuned by Suchinthana on 3,000 examples from the databricks-dolly-15k-sinhala dataset. The model is optimized for Sinhala-language tasks and is intended for applications that require natural language understanding and generation in Sinhala.
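Since the model is published under a standard Hugging Face repo id, it can presumably be loaded with the `transformers` library. A minimal sketch follows; note that the Dolly-style prompt template below is an assumption (the card does not document the exact format used during fine-tuning), and the generation calls are shown as comments because they require downloading the 7B weights:

```python
# Sketch: querying Suchinthana/databricks-dolly-15k-sinhala via transformers.
# The instruction/context/response template is assumed from the parent
# databricks-dolly-15k format; adjust if the fine-tune used a different one.

def build_prompt(instruction: str, context: str = "") -> str:
    """Assemble a Dolly-style instruction prompt (assumed format)."""
    if context:
        return (f"Instruction: {instruction}\n"
                f"Context: {context}\n"
                f"Response: ")
    return f"Instruction: {instruction}\nResponse: "

if __name__ == "__main__":
    prompt = build_prompt("...")  # place a Sinhala instruction here
    # Actual inference (downloads ~7B weights; shown for reference only):
    # from transformers import AutoModelForCausalLM, AutoTokenizer
    # repo = "Suchinthana/databricks-dolly-15k-sinhala"
    # tok = AutoTokenizer.from_pretrained(repo)
    # model = AutoModelForCausalLM.from_pretrained(repo)
    # ids = tok(prompt, return_tensors="pt")
    # out = model.generate(**ids, max_new_tokens=256)
    # print(tok.decode(out[0], skip_special_tokens=True))
    print(prompt)
```

The 4k context limit above applies to the combined prompt and generated response, so long `context` passages should be truncated before prompting.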
