JoaoReiz/Llama3.2_1B_cachacaNER
Text Generation | Concurrency Cost: 1 | Model Size: 1B | Quant: BF16 | Ctx Length: 32k | Published: Mar 31, 2026 | License: apache-2.0 | Architecture: Transformer | Open Weights | Cold

JoaoReiz/Llama3.2_1B_cachacaNER is a 1-billion-parameter instruction-tuned language model based on Llama 3.2, developed by JoaoReiz. It was fine-tuned with Unsloth and Hugging Face's TRL library, which enables faster training. The model is intended for general language tasks, building on the Llama 3.2 architecture and an efficient training methodology.
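Since this is an open-weights model published under the standard Hugging Face repo naming scheme, it can presumably be loaded with the `transformers` library. The sketch below is illustrative only: the model id comes from this card, but the prompt, generation parameters, and the `RUN_INFERENCE` guard are assumptions, not documented usage.

```python
import os

MODEL_ID = "JoaoReiz/Llama3.2_1B_cachacaNER"  # model id from this card

def build_chat(user_text: str) -> list[dict]:
    """Wrap a prompt in the chat-message format transformers pipelines accept."""
    return [{"role": "user", "content": user_text}]

# Actual inference needs `transformers` and `torch` installed plus a network
# connection to download the weights, so it is gated behind an env flag here.
if os.environ.get("RUN_INFERENCE"):
    from transformers import pipeline

    # torch_dtype="bfloat16" matches the BF16 quantization listed on the card.
    generator = pipeline("text-generation", model=MODEL_ID, torch_dtype="bfloat16")
    result = generator(build_chat("Hello, what can you do?"), max_new_tokens=64)
    print(result[0]["generated_text"])
```

Set `RUN_INFERENCE=1` in the environment to trigger the actual model download and generation; otherwise the script only defines the helpers.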
