achinta3/llama_3.2_3b-owl_numbers_full_ep10
Text Generation · Concurrency Cost: 1 · Model Size: 3.2B · Quant: BF16 · Ctx Length: 32k · Published: Mar 24, 2026 · License: apache-2.0 · Architecture: Transformer · Open Weights

achinta3/llama_3.2_3b-owl_numbers_full_ep10 is a 3.2-billion-parameter Llama 3.2 instruction-tuned model, fine-tuned by achinta3 using Unsloth and Hugging Face's TRL library. The model was trained with a focus on efficiency, with Unsloth enabling roughly 2x faster fine-tuning. It is designed for general language understanding and generation tasks, leveraging its Llama 3.2 base for robust performance.
