TucanoBR/Tucano-1b1
Task: Text Generation · Model Size: 1.1B · Quantization: BF16 · Context Length: 2k · Published: Sep 23, 2024 · License: apache-2.0 · Architecture: Transformer

TucanoBR/Tucano-1b1 is a 1.1-billion-parameter Transformer-based causal language model from TucanoBR, pretrained natively in Portuguese. It was trained on GigaVerbo, a 200-billion-token Portuguese corpus, making it well suited to Portuguese language modeling research and development. The model serves as a foundational checkpoint for comparative experiments and fine-tuning in Portuguese.
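A minimal usage sketch, assuming the model is available on the Hugging Face Hub under the id `TucanoBR/Tucano-1b1` and loadable with the standard `transformers` causal-LM classes; the prompt and generation settings are illustrative assumptions:

```python
def generate(prompt: str, max_new_tokens: int = 50) -> str:
    """Continue a Portuguese prompt with Tucano-1b1 (assumed Hub id)."""
    # Imports are kept local so defining this sketch stays lightweight;
    # the first call downloads the model weights (~2 GB in BF16).
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "TucanoBR/Tucano-1b1"  # assumed Hugging Face Hub id
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id)

    inputs = tokenizer(prompt, return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)


# Example (downloads weights on first run, so left commented out):
# print(generate("A capital do Brasil é"))
```

Note the 2k context window above: prompts plus generated tokens must fit within 2,048 tokens.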
