JoaoReiz/Llama3.2_3B_Paramopama
Text Generation | Concurrency Cost: 1 | Model Size: 3.2B | Quant: BF16 | Ctx Length: 32k | Published: Apr 6, 2026 | License: apache-2.0 | Architecture: Transformer | Open Weights
JoaoReiz/Llama3.2_3B_Paramopama is a 3.2-billion-parameter, Llama-based, instruction-tuned language model developed by JoaoReiz. It was fine-tuned from unsloth/llama-3.2-3b-instruct-unsloth-bnb-4bit using Unsloth together with Hugging Face's TRL library, which accelerates training. The model is intended for general text generation tasks.
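As a minimal usage sketch, the snippet below loads the checkpoint with the standard Hugging Face transformers API and generates a reply. It assumes the model ships a tokenizer with a Llama-3.2-style chat template and loads in BF16 as listed above; the prompt and generation parameters are illustrative only and are not taken from the model card.

```python
# Hypothetical usage sketch: load JoaoReiz/Llama3.2_3B_Paramopama and generate text.
# Assumes a standard Llama-3.2 chat template; settings below are illustrative.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "JoaoReiz/Llama3.2_3B_Paramopama"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the BF16 quant listed above
    device_map="auto",
)

# Build a chat-formatted prompt and generate a short completion.
messages = [{"role": "user", "content": "Summarize what instruction tuning is in one sentence."}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output_ids = model.generate(input_ids, max_new_tokens=128, do_sample=True, temperature=0.7)
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```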