Suchinthana/Sinhala-Translate-and-Dolly-Llama-7b
Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 4k · License: llama2 · Architecture: Transformer · Open Weights · Cold

Suchinthana/Sinhala-Translate-and-Dolly-Llama-7b is a 7-billion-parameter language model fine-tuned from a Llama base model. It was sequentially fine-tuned on the Dolly 3k instruction dataset and a 20k-pair Sinhala-English translation dataset covering both directions, and its primary strength is translation between Sinhala and English.
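
Below is a minimal sketch of loading the model with the Hugging Face transformers library and prompting it for a Sinhala-to-English translation. The instruction-style prompt format is an assumption (a Dolly/Alpaca-like template), not a format documented by the model author; adjust it to whatever template the fine-tuning actually used.

```python
# Sketch only: assumes transformers (and accelerate for device_map="auto") are installed.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Suchinthana/Sinhala-Translate-and-Dolly-Llama-7b"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

# Assumed instruction-style prompt; the actual fine-tuning template may differ.
prompt = (
    "### Instruction:\n"
    "Translate the following Sinhala sentence to English.\n\n"
    "### Input:\n"
    "මට උදව් කරන්න පුළුවන්ද?\n\n"
    "### Response:\n"
)

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128, do_sample=False)

# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```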
