sayururehan/sinhala-qwen3-4b-lora
Text Generation · Concurrency Cost: 1 · Model Size: 4B · Quant: BF16 · Ctx Length: 32K · Published: Mar 25, 2026 · Architecture: Transformer

The sinhala-qwen3-4b-lora model by sayururehan is a 4-billion-parameter language model based on the Qwen3 architecture, with a 32K-token context length. It is the merged output of a LoRA fine-tune (the adapter weights have been folded into the base model), specialized for Sinhala language tasks. It is intended for applications that need natural language processing in Sinhala, such as text generation and instruction following.
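Since the LoRA adapter is already merged into the base weights, the model can be loaded like any standard causal LM checkpoint. The sketch below assumes the repository id shown above and the standard Hugging Face `transformers` API; the prompt text and generation parameters are illustrative, not part of the model card.

```python
# Minimal usage sketch for a merged-LoRA checkpoint.
# Assumes: `transformers` and `torch` are installed, and the repo id below
# is reachable; nothing here is taken from an official usage example.
MODEL_ID = "sayururehan/sinhala-qwen3-4b-lora"


def generate(prompt: str, max_new_tokens: int = 128) -> str:
    """Generate a completion for `prompt` with the merged model."""
    # Import lazily so the module can be inspected without the heavy deps.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    # BF16 matches the published quantization of this checkpoint.
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID, torch_dtype="bfloat16", device_map="auto"
    )

    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    output_ids = model.generate(**inputs, max_new_tokens=max_new_tokens)
    # Strip the prompt tokens and decode only the newly generated text.
    new_tokens = output_ids[0][inputs["input_ids"].shape[1]:]
    return tokenizer.decode(new_tokens, skip_special_tokens=True)


if __name__ == "__main__":
    # Hypothetical Sinhala prompt ("Hello, how are you?").
    print(generate("ආයුබෝවන්, ඔබට කොහොමද?"))
```

Because the adapter is pre-merged, no `peft` dependency or separate adapter download is needed at inference time.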
