JoaoReiz/Llama3.2_3B_LeNER
TEXT GENERATION
Concurrency Cost: 1 · Model Size: 3.2B · Quant: BF16 · Ctx Length: 32k · Published: Apr 5, 2026 · License: apache-2.0 · Architecture: Transformer · Open Weights · Cold

JoaoReiz/Llama3.2_3B_LeNER is a 3.2 billion parameter instruction-tuned language model based on Llama 3.2, developed by JoaoReiz. It was fine-tuned from unsloth/llama-3.2-3b-instruct-unsloth-bnb-4bit using Unsloth together with Hugging Face's TRL library, a combination the author reports trained 2x faster. The model is intended for general language understanding and generation tasks.
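Since the card does not include a usage snippet, the following is a minimal sketch of how a model with this id could be loaded and queried via the Hugging Face transformers library. The generation settings and the prompt helper are illustrative assumptions, not part of the card; the heavy imports are deferred into the function so the pure-Python helper has no dependencies.

```python
MODEL_ID = "JoaoReiz/Llama3.2_3B_LeNER"  # model id from this card


def build_messages(user_prompt: str) -> list[dict]:
    """Wrap a user prompt in the chat format expected by Llama 3.2
    instruct-style models (single-turn; system prompt omitted)."""
    return [{"role": "user", "content": user_prompt}]


def generate(prompt: str, max_new_tokens: int = 128) -> str:
    """Hypothetical generation helper: loads the model in BF16 (the
    precision listed on the card) and decodes only the new tokens."""
    # Imported lazily so build_messages() stays dependency-free.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, torch_dtype="bfloat16")
    inputs = tokenizer.apply_chat_template(
        build_messages(prompt), add_generation_prompt=True, return_tensors="pt"
    )
    output = model.generate(inputs, max_new_tokens=max_new_tokens)
    # Strip the prompt tokens, keeping only the model's reply.
    return tokenizer.decode(output[0][inputs.shape[-1]:], skip_special_tokens=True)
```

This assumes standard transformers chat-template support; check the repository's tokenizer config before relying on a specific prompt format.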
