JoaoReiz/Llama3.2_3B_LeNER

Text Generation · Concurrency Cost: 1 · Model Size: 3.2B · Quant: BF16 · Ctx Length: 32k · Published: Apr 5, 2026 · License: apache-2.0 · Architecture: Transformer · Open Weights

JoaoReiz/Llama3.2_3B_LeNER is a 3.2-billion-parameter, Llama-based, instruction-tuned language model developed by JoaoReiz. It was finetuned from unsloth/llama-3.2-3b-instruct-unsloth-bnb-4bit using Unsloth and Hugging Face's TRL library, a combination that enabled roughly 2x faster training. The model is designed for general language understanding and generation tasks.


Model Overview

JoaoReiz/Llama3.2_3B_LeNER is a 3.2 billion parameter instruction-tuned language model developed by JoaoReiz. It is based on the Llama architecture and was finetuned from the unsloth/llama-3.2-3b-instruct-unsloth-bnb-4bit model.

Key Characteristics

  • Efficient Training: This model was trained with Unsloth and Hugging Face's TRL library, which enabled roughly 2x faster training compared to standard methods.
  • Llama-based Architecture: Leverages the robust Llama model family for its foundational capabilities.
  • Instruction-Tuned: Designed to follow instructions effectively, making it suitable for a variety of conversational and task-oriented applications.
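Since the model is instruction-tuned, the usual way to query it is through its chat template. The following is a minimal sketch, assuming the model can be loaded with transformers' Auto classes and follows a Llama-3-style chat template; the system prompt and generation settings here are illustrative, not part of the model card.

```python
# Minimal inference sketch for JoaoReiz/Llama3.2_3B_LeNER.
# Assumes `transformers` and `torch` are installed and that the model
# ships a chat template (standard for Llama 3.x instruct finetunes).

MODEL_ID = "JoaoReiz/Llama3.2_3B_LeNER"


def build_messages(user_prompt, system_prompt="You are a helpful assistant."):
    """Build a chat message list in the role/content format consumed by
    tokenizer.apply_chat_template. The system prompt is an assumption."""
    return [
        {"role": "system", "content": system_prompt},
        {"role": "user", "content": user_prompt},
    ]


def generate(user_prompt, max_new_tokens=256):
    """Download the model and generate a reply. Requires enough memory
    for a 3.2B-parameter model in BF16 (~7 GB of weights)."""
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID, torch_dtype=torch.bfloat16, device_map="auto"
    )
    inputs = tokenizer.apply_chat_template(
        build_messages(user_prompt),
        add_generation_prompt=True,
        return_tensors="pt",
    ).to(model.device)
    output = model.generate(inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, not the prompt.
    return tokenizer.decode(output[0][inputs.shape[-1]:], skip_special_tokens=True)
```

Calling `generate("Summarize this paragraph: ...")` would download the weights on first use and return the model's reply as a string.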

Potential Use Cases

  • General Text Generation: Capable of generating coherent and contextually relevant text based on prompts.
  • Instruction Following: Can be used for tasks requiring adherence to specific instructions, such as summarization, question answering, or content creation.
  • Research and Development: Its efficient training methodology makes it an interesting candidate for further experimentation and finetuning on specific datasets.
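For the research-and-development use case, further finetuning can follow the same Unsloth + TRL recipe the card mentions. The sketch below uses TRL's `SFTTrainer` with plain transformers loading; every hyperparameter and the dataset argument are illustrative assumptions, not the values used to train this model, and the exact `SFTTrainer` keyword names vary across TRL versions.

```python
# Hedged sketch: continued supervised finetuning of this model with TRL's
# SFTTrainer. All values below are assumed defaults for a 3B model on a
# single GPU, not the original training configuration.

MODEL_ID = "JoaoReiz/Llama3.2_3B_LeNER"

# Illustrative hyperparameters (assumptions): effective batch size of 8
# via gradient accumulation, one epoch, a typical finetuning LR.
TRAIN_KWARGS = {
    "per_device_train_batch_size": 2,
    "gradient_accumulation_steps": 4,
    "learning_rate": 2e-4,
    "num_train_epochs": 1,
    "logging_steps": 10,
    "output_dir": "outputs",
}


def finetune(train_dataset):
    """Run supervised finetuning on a prepared dataset of chat/text
    examples. Requires trl, transformers, and torch; note that newer
    TRL versions take `processing_class` instead of `tokenizer`."""
    from transformers import AutoModelForCausalLM, AutoTokenizer, TrainingArguments
    from trl import SFTTrainer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID)
    trainer = SFTTrainer(
        model=model,
        tokenizer=tokenizer,
        train_dataset=train_dataset,
        args=TrainingArguments(**TRAIN_KWARGS),
    )
    trainer.train()
    return trainer
```

Swapping the transformers loading for Unsloth's `FastLanguageModel` would recover the 2x speedup the card attributes to the original training run.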