achinta3/llama_3.2_3b-owl_numbers_full_ep1
Text generation · Concurrency cost: 1 · Model size: 3.2B · Quant: BF16 · Context length: 32k · Published: Mar 24, 2026 · License: apache-2.0 · Architecture: Transformer (open weights)

achinta3/llama_3.2_3b-owl_numbers_full_ep1 is a 3.2 billion parameter Llama-based language model developed by achinta3, fine-tuned from unsloth/Llama-3.2-3B-Instruct. The model was trained with Unsloth and Hugging Face's TRL library, a setup geared toward faster fine-tuning. With a context length of 32768 tokens, it is intended for general language tasks on the Llama architecture.
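
Below is a minimal usage sketch with Hugging Face Transformers, assuming the model is available on the Hub under the repository id above and that a BF16-capable GPU is present; the prompt and generation settings are illustrative only.

```python
# Minimal sketch: load the fine-tune and run a short chat-formatted generation.
# Adjust dtype/device for your hardware; BF16 matches the quantization listed above.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "achinta3/llama_3.2_3b-owl_numbers_full_ep1"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

# Llama 3.2 Instruct derivatives expect chat-formatted input, so apply the chat template.
messages = [{"role": "user", "content": "Give me three odd numbers between 10 and 20."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```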
