TucanoBR/Tucano-1b1-Instruct
TEXT GENERATION
Concurrency Cost: 1 | Model Size: 1.1B | Quant: BF16 | Ctx Length: 2k | Published: Sep 30, 2024 | License: apache-2.0 | Architecture: Transformer | Open Weights

TucanoBR/Tucano-1b1-Instruct is a 1.1 billion parameter instruction-tuned decoder-only Transformer model developed by TucanoBR and pretrained natively in Portuguese. It was fine-tuned with Supervised Fine-Tuning (SFT) followed by Direct Preference Optimization (DPO) on a range of instruction datasets. The model is intended for research and development in native Portuguese language modeling, serving as a baseline for comparative experiments and further adaptation.
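As a sketch of how the model might be used for text generation, the snippet below loads it through the Hugging Face `transformers` library. The repo id comes from this card; the `<instruction>` prompt wrapper and the generation settings are assumptions for illustration, and the tokenizer's own chat template (if one ships with the model) should take precedence over the hand-rolled helper.

```python
# Hypothetical usage sketch for TucanoBR/Tucano-1b1-Instruct.
# The prompt format below is an assumption; prefer the tokenizer's
# built-in chat template when available.

def format_instruction(instruction: str) -> str:
    """Wrap a user instruction in an assumed <instruction> chat layout."""
    return f"<instruction>{instruction.strip()}</instruction>"

if __name__ == "__main__":
    # Requires: pip install transformers torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    repo_id = "TucanoBR/Tucano-1b1-Instruct"
    tokenizer = AutoTokenizer.from_pretrained(repo_id)
    model = AutoModelForCausalLM.from_pretrained(repo_id)

    prompt = format_instruction("Qual é a capital do Brasil?")
    inputs = tokenizer(prompt, return_tensors="pt")
    # max_new_tokens is an arbitrary illustrative choice.
    output = model.generate(**inputs, max_new_tokens=64)
    print(tokenizer.decode(output[0], skip_special_tokens=True))
```

Since the model is only 1.1B parameters in BF16, it fits comfortably on a single consumer GPU or can run on CPU for experimentation.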
