uselevers/levers-base-najdi-72b-it-merged
Task: Text generation
Concurrency cost: 4
Model size: 72.7B
Quantization: FP8
Context length: 32k
Published: Feb 6, 2026
License: apache-2.0
Architecture: Transformer
Status: Open Weights, Gated, Cold

uselevers/levers-base-najdi-72b-it-merged is a 72.7-billion-parameter, Qwen2-based, instruction-tuned language model developed by uselevers. It was finetuned with Unsloth and Hugging Face's TRL library, a combination Unsloth reports can speed up training by up to 2x. The model targets general instruction-following tasks, relying on its large parameter count for robust performance.
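Qwen2-family instruction models are typically prompted with the ChatML layout (`<|im_start|>role ... <|im_end|>` blocks). The sketch below illustrates that format with a small hypothetical helper; in practice the model's bundled tokenizer chat template handles this, so the helper is for illustration only.

```python
def build_chatml_prompt(messages):
    """Render a list of {role, content} messages in the ChatML layout
    used by Qwen2-family models, ending with a generation prompt."""
    parts = [
        f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>"
        for m in messages
    ]
    # Open an assistant turn so the model continues from here.
    parts.append("<|im_start|>assistant\n")
    return "\n".join(parts)

prompt = build_chatml_prompt([
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Summarize this model card in one sentence."},
])
print(prompt)
```

With a loaded tokenizer, `tokenizer.apply_chat_template(messages, add_generation_prompt=True)` produces the equivalent string without hand-rolling the format.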
