amadeusai/Amadeus-Verbo-BI-Qwen-2.5-0.5B-PT-BR-Instruct-Experimental
Text Generation · Concurrency Cost: 1 · Model Size: 0.5B · Quant: BF16 · Ctx Length: 32k · License: apache-2.0 · Architecture: Transformer · Open Weights

Amadeus-Verbo-BI-Qwen-2.5-0.5B-PT-BR-Instruct-Experimental is a 0.49-billion-parameter instruction-tuned causal language model developed by amadeusai, based on the Qwen2.5 architecture. Fine-tuned for 2 epochs on 600k instructions, it is optimized specifically for Brazilian Portuguese. The model supports a 32,768-token (32k) context length and is designed for generating text in Portuguese.
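Since the model is instruction-tuned on the Qwen2.5 architecture, prompts are typically expected in the ChatML format used by Qwen2.5 instruct models. As a rough illustration (in practice, prefer `tokenizer.apply_chat_template` from the `transformers` library, which applies the model's own template; the function below is a hypothetical sketch, assuming the standard ChatML layout):

```python
def build_chatml_prompt(messages):
    """Assemble a ChatML-style prompt from role/content message dicts.

    Each turn is wrapped in <|im_start|>role ... <|im_end|> markers, and the
    prompt ends with an open assistant turn for the model to complete.
    """
    parts = [
        f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>"
        for m in messages
    ]
    parts.append("<|im_start|>assistant\n")
    return "\n".join(parts)

# Example: a Brazilian Portuguese instruction
prompt = build_chatml_prompt([
    {"role": "system", "content": "Você é um assistente prestativo."},
    {"role": "user", "content": "Explique o que é aprendizado de máquina."},
])
print(prompt)
```

The resulting string can be tokenized and passed to the model for generation; the 32k context window bounds the combined length of the prompt and the generated output.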
