MikCil/PREMOVE_llama3.3-70b_float16
Text generation · Concurrency cost: 4 · Model size: 70B · Quantization: FP8 · Context length: 32k · Published: Jan 11, 2026 · License: apache-2.0 · Architecture: Transformer · Open weights · Cold

MikCil/PREMOVE_llama3.3-70b_float16 is a 70-billion-parameter Llama 3.3 instruction-tuned model published by MikCil, fine-tuned from unsloth/llama-3.3-70b-instruct-unsloth-bnb-4bit. It was trained with Unsloth and Hugging Face's TRL library, a combination Unsloth reports as roughly 2x faster than standard fine-tuning. The model is intended for general-purpose language tasks.
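The card does not include usage code, so the following is a minimal sketch of loading the model with Hugging Face Transformers. It assumes the weights are hosted on the Hub under this repo id and that enough GPU memory is available for a 70B model (roughly 140 GB in float16; the listing's FP8 quantization would need less); the prompt is illustrative.

```python
# Minimal sketch: load and query the model with Hugging Face Transformers.
# Assumes the repo id below resolves on the Hub and sufficient GPU memory.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "MikCil/PREMOVE_llama3.3-70b_float16"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # matches the float16 suffix in the repo name
    device_map="auto",          # shard across available GPUs
)

# Llama 3.3 instruct models expect a chat-formatted prompt;
# apply_chat_template builds it from a list of messages.
messages = [
    {"role": "user", "content": "Summarize the rules of chess in three sentences."}
]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=256)
# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```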
