AdrianFernandes/Qwen2.5-3B-Konkani
TEXT GENERATION · Open Weights
Concurrency Cost: 1 · Model Size: 3.1B · Quant: BF16 · Ctx Length: 32k
Published: Apr 2, 2026 · License: apache-2.0 · Architecture: Transformer
AdrianFernandes/Qwen2.5-3B-Konkani is a 3.1 billion parameter Qwen2.5 model developed by AdrianFernandes and fine-tuned for the Konkani language. Training used Unsloth together with Hugging Face's TRL library for accelerated fine-tuning. With a context length of 32768 tokens, the model is intended for language generation and understanding tasks in Konkani.
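As a minimal usage sketch, the model can be loaded like any other Qwen2.5 checkpoint through the Hugging Face `transformers` library. The repository id below matches this card; the prompt and generation parameters are illustrative assumptions, and `transformers` plus `torch` must be installed separately.

```python
# Hypothetical loading sketch for AdrianFernandes/Qwen2.5-3B-Konkani.
# Assumes the third-party `transformers` and `torch` packages are installed.

MODEL_ID = "AdrianFernandes/Qwen2.5-3B-Konkani"

def generate(prompt: str, max_new_tokens: int = 128) -> str:
    """Generate a Konkani continuation for `prompt` (downloads ~3.1B weights)."""
    # Imported lazily so the module can be used without transformers installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, torch_dtype="auto")

    inputs = tokenizer(prompt, return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)

if __name__ == "__main__":
    # Example prompt; actual output depends on the downloaded weights.
    print(generate("Konkani is a language spoken in"))
```

Because the checkpoint is published in BF16, loading with `torch_dtype="auto"` keeps the original precision on supported hardware.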