shabieh2/3370_0412
Task: Text Generation
Concurrency Cost: 4
Model Size: 70B
Quantization: FP8
Context Length: 8k
Published: Apr 12, 2026
License: apache-2.0
Architecture: Transformer (Open Weights)

shabieh2/3370_0412 is a 70-billion-parameter variant of Llama-3.3-Instruct published by shabieh2. It was fine-tuned with Unsloth, a library for faster, more memory-efficient fine-tuning of large language models. The model targets standard instruction-following workloads, using its large parameter count for complex reasoning and text generation within its 8k-token context window.
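As a usage sketch, the checkpoint can presumably be loaded with the Hugging Face `transformers` API like any Llama-3.3 derivative. The generation settings below are illustrative assumptions, not values from this model card, and `build_prompt` hand-writes the Llama-3 chat layout that `tokenizer.apply_chat_template` would normally produce:

```python
MODEL_ID = "shabieh2/3370_0412"

def build_prompt(system: str, user: str) -> str:
    """Format a single-turn prompt in the Llama-3 chat layout
    (normally produced by tokenizer.apply_chat_template)."""
    return (
        "<|begin_of_text|>"
        f"<|start_header_id|>system<|end_header_id|>\n\n{system}<|eot_id|>"
        f"<|start_header_id|>user<|end_header_id|>\n\n{user}<|eot_id|>"
        "<|start_header_id|>assistant<|end_header_id|>\n\n"
    )

def generate(user_msg: str, max_new_tokens: int = 256) -> str:
    # Imports kept local: loading a 70B checkpoint requires transformers
    # plus substantial GPU memory; device_map="auto" shards the weights
    # across whatever accelerators are available.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto")
    prompt = build_prompt("You are a helpful assistant.", user_msg)
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    # Keep prompt + completion within the model's 8k context length.
    out = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(out[0][inputs["input_ids"].shape[-1]:],
                            skip_special_tokens=True)

if __name__ == "__main__":
    print(generate("Summarize the transformer architecture in two sentences."))
```

Because the imports live inside `generate`, the prompt-formatting helper can be exercised without downloading the weights.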
