Suchinthana/databricks-dolly-15k-sinhala

Text Generation · Model Size: 7B · Quant: FP8 · Context Length: 4k · License: llama2 · Architecture: Transformer · Open Weights · Concurrency Cost: 1

Suchinthana/databricks-dolly-15k-sinhala is a 7-billion-parameter Llama 2 model fine-tuned by Suchinthana for Sinhala language tasks. It was trained on 3,000 examples from the databricks-dolly-15k-sinhala dataset and is intended for applications that require natural language understanding and generation in Sinhala.


Overview

Suchinthana/databricks-dolly-15k-sinhala is a Llama 2-based large language model with 7 billion parameters, fine-tuned by Suchinthana to strengthen its capabilities in Sinhala. Fine-tuning used 3,000 examples from the databricks-dolly-15k-sinhala dataset, run over 200 steps, or approximately 1.01 epochs of training.

Key Capabilities

  • Sinhala Language Processing: Optimized for understanding and generating text in Sinhala.
  • Instruction Following: Inherits the instruction-following format of the base databricks-dolly-15k dataset, adapted to Sinhala.
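Dolly-style datasets pair an instruction with optional context, so prompts at inference time are usually assembled in the same shape. A minimal sketch of such a prompt builder is below; the exact field labels follow the common databricks-dolly-15k convention and are an assumption, since the model card does not document the template used during fine-tuning:

```python
def build_dolly_prompt(instruction: str, context: str = "") -> str:
    """Assemble a Dolly-style instruction prompt.

    NOTE: the "### Instruction / Context / Response" labels follow the
    common databricks-dolly-15k convention; the exact template used for
    this model's fine-tuning is an assumption, not documented here.
    """
    if context:
        return (
            f"### Instruction:\n{instruction}\n\n"
            f"### Context:\n{context}\n\n"
            "### Response:\n"
        )
    return f"### Instruction:\n{instruction}\n\n### Response:\n"


# Example with a Sinhala instruction ("What is the capital of Sri Lanka?")
prompt = build_dolly_prompt("ශ්‍රී ලංකාවේ අගනුවර කුමක්ද?")
```

The resulting string would then be passed to the model's tokenizer and generation call as with any other Llama 2 checkpoint.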

Good For

  • Applications requiring natural language understanding in Sinhala.
  • Generating Sinhala text based on prompts or instructions.
  • Research and development in Sinhala NLP.