amadeusai/Amadeus-Verbo-FI-Qwen2.5-3B-PT-BR-Instruct
Text generation · Concurrency cost: 1 · Model size: 3.1B · Quant: BF16 · Context length: 32k · Published: Mar 26, 2025 · License: apache-2.0 · Architecture: Transformer · Open weights

Amadeus-Verbo-FI-Qwen2.5-3B-PT-BR-Instruct is a 3.09-billion-parameter Transformer-based causal language model developed by amadeusai and fine-tuned from Qwen2.5-3B-Instruct. The model is optimized for Brazilian Portuguese, having been fine-tuned for two epochs on a dataset of 600k instructions. It supports a 32,768-token context length and is designed for instruction-following tasks in Portuguese.
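Since the model is fine-tuned from Qwen2.5-3B-Instruct, it should load through the standard Hugging Face `transformers` chat workflow. A minimal sketch (assuming the repository follows the usual Qwen2.5 chat template; the system/user prompts below are illustrative):

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "amadeusai/Amadeus-Verbo-FI-Qwen2.5-3B-PT-BR-Instruct"

# Load tokenizer and model; device_map="auto" places weights on available hardware
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    torch_dtype="auto",   # honors the published BF16 weights where supported
    device_map="auto",
)

# Example Brazilian Portuguese instruction (hypothetical prompt content)
messages = [
    {"role": "system", "content": "Você é um assistente prestativo."},
    {"role": "user", "content": "Explique brevemente o que é aprendizado de máquina."},
]

# Apply the chat template inherited from Qwen2.5-Instruct
text = tokenizer.apply_chat_template(
    messages, tokenize=False, add_generation_prompt=True
)
inputs = tokenizer([text], return_tensors="pt").to(model.device)

outputs = model.generate(**inputs, max_new_tokens=256)
# Strip the prompt tokens, keeping only the newly generated response
response = tokenizer.decode(
    outputs[0][inputs.input_ids.shape[-1]:], skip_special_tokens=True
)
print(response)
```

Generation parameters such as `max_new_tokens`, temperature, and sampling settings can be tuned for the task; the 32k context window leaves room for long Portuguese documents in the prompt.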
