CEIA-UFG/Gemma-3-Gaia-PT-BR-4b-it
Text Generation · Concurrency Cost: 1 · Model Size: 4.3B · Quant: BF16 · Ctx Length: 32k · Published: May 25, 2025 · License: gemma · Vision · Architecture: Transformer

GAIA (Gemma-3-Gaia-PT-BR-4b-it) is a 4.3-billion-parameter, decoder-only causal Transformer language model developed by CEIA-UFG, ABRI, Nama, Amadeus AI, and Google DeepMind. It was continually pre-trained on 13 billion tokens of high-quality Portuguese data and is optimized for Brazilian Portuguese. The model excels at text generation and conversational tasks in Portuguese, with improved performance on Brazilian benchmarks such as ENEM.
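A minimal sketch of running the model locally, assuming the standard Hugging Face `transformers` text-generation pipeline (the model id comes from this card; the dtype choice and the example prompt are illustrative assumptions):

```python
def build_chat(prompt: str) -> list[dict]:
    # Instruction-tuned ("-it") Gemma-family models expect chat-formatted input.
    return [{"role": "user", "content": prompt}]

def load_gaia():
    # Heavy import kept local so build_chat stays dependency-free.
    from transformers import pipeline
    return pipeline(
        "text-generation",
        model="CEIA-UFG/Gemma-3-Gaia-PT-BR-4b-it",
        torch_dtype="bfloat16",  # assumption: matches the BF16 quant listed above
    )

# Usage (downloads the 4.3B-parameter weights on first run):
# generator = load_gaia()
# reply = generator(build_chat("Qual é a capital do Brasil?"), max_new_tokens=64)
```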


Popular Sampler Settings

The top three parameter combinations used by Featherless users for this model tune the following samplers:

- temperature
- top_p
- top_k
- frequency_penalty
- presence_penalty
- repetition_penalty
- min_p
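These knobs map onto the sampling parameters of an OpenAI-compatible completion request. As a sketch, a configuration covering all seven might look like the following; the values are common conservative defaults, not settings published for this model:

```python
# Illustrative sampler configuration; values are assumptions, not GAIA's defaults.
sampler_config = {
    "temperature": 0.7,         # randomness of token sampling
    "top_p": 0.9,               # nucleus sampling: keep tokens within this cumulative prob.
    "top_k": 40,                # restrict sampling to the k most likely tokens
    "frequency_penalty": 0.0,   # penalize tokens by how often they already appeared
    "presence_penalty": 0.0,    # penalize tokens that appeared at all
    "repetition_penalty": 1.1,  # multiplicative discouragement of repeats
    "min_p": 0.05,              # drop tokens below this fraction of the top token's prob.
}
```

Such a dict can be passed as extra body parameters in an OpenAI-style chat completion request against an inference API that supports these samplers.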