uselevers/levers-base-najdi-72b-it-merged

Text Generation | Concurrency Cost: 4 | Model Size: 72.7B | Quant: FP8 | Context Length: 32k | Published: Feb 6, 2026 | License: apache-2.0 | Architecture: Transformer | Open Weights | Gated | Cold

The uselevers/levers-base-najdi-72b-it-merged model is a 72.7-billion-parameter, Qwen2.5-based, instruction-tuned language model developed by uselevers. It was finetuned using Unsloth and Hugging Face's TRL library, enabling 2x faster training. The model is designed for general instruction-following tasks, leveraging its large parameter count for robust performance.


Model Overview

uselevers/levers-base-najdi-72b-it-merged is a large language model developed by uselevers. It derives from Qwen2.5 (the Qwen2 transformer architecture) and has 72.7 billion parameters, making it suitable for a wide range of complex natural language processing tasks.
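Since this is a merged instruction-tuned checkpoint, it can be loaded with the standard Hugging Face Transformers API. A minimal sketch, assuming access to the gated repo has been granted (`huggingface-cli login`) and enough GPU memory is available to shard a 72.7B model; the prompt text is illustrative:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "uselevers/levers-base-najdi-72b-it-merged"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",  # keep the checkpoint's native dtype
    device_map="auto",   # shard layers across all available GPUs
)

# Qwen2.5-style instruct models ship a chat template with the tokenizer.
messages = [{"role": "user", "content": "Summarize why instruction tuning matters."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=256)
# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```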

Key Characteristics

  • Base Model: Finetuned from unsloth/qwen2.5-72b-instruct-bnb-4bit.
  • Training Efficiency: Trained 2x faster by combining the Unsloth library with Hugging Face's TRL library; a sketch of this recipe follows this list.
  • Instruction-Tuned: Optimized for understanding and following instructions, making it versatile for various prompt-based applications.
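The card does not publish the training script, so the following is only a minimal sketch of the Unsloth + TRL recipe it names, written against TRL's classic SFTTrainer API (newer TRL versions rename some arguments). The dataset path, LoRA rank, and hyperparameters are placeholders, not the values uselevers actually used:

```python
from unsloth import FastLanguageModel
from trl import SFTTrainer, SFTConfig
from datasets import load_dataset

# Load the 4-bit base model listed on the card; Unsloth patches it
# for its reported ~2x training speedup.
model, tokenizer = FastLanguageModel.from_pretrained(
    model_name="unsloth/qwen2.5-72b-instruct-bnb-4bit",
    max_seq_length=2048,
    load_in_4bit=True,
)

# Attach LoRA adapters to the usual attention/MLP projections.
model = FastLanguageModel.get_peft_model(
    model,
    r=16,
    lora_alpha=16,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj",
                    "gate_proj", "up_proj", "down_proj"],
)

# Placeholder dataset: a JSONL file with a "text" column.
dataset = load_dataset("json", data_files="train.jsonl", split="train")

trainer = SFTTrainer(
    model=model,
    tokenizer=tokenizer,
    train_dataset=dataset,
    args=SFTConfig(
        dataset_text_field="text",
        per_device_train_batch_size=1,
        gradient_accumulation_steps=8,
        max_steps=100,
        output_dir="outputs",
    ),
)
trainer.train()

# Merging the LoRA weights back into the base produces a standalone
# "-merged" checkpoint like this one.
model.save_pretrained_merged("levers-base-najdi-72b-it-merged",
                             tokenizer, save_method="merged_16bit")
```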

Intended Use Cases

This model is well-suited for applications that need a powerful instruction-following LLM, benefiting from its large parameter count and efficient training methodology. It can be applied to tasks such as the following (a hosted-inference sketch follows the list):

  • General-purpose text generation
  • Question answering
  • Summarization
  • Creative writing
  • Code generation (inherited from the Qwen2.5-Instruct base model)
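If the hosting platform exposes an OpenAI-compatible chat completions endpoint (an assumption; check the provider's documentation), any of these tasks can be exercised with a standard client call. The `base_url` and API key below are placeholders:

```python
from openai import OpenAI

# Placeholder endpoint and key; substitute the provider's actual values.
client = OpenAI(base_url="https://example-host/v1", api_key="YOUR_API_KEY")

resp = client.chat.completions.create(
    model="uselevers/levers-base-najdi-72b-it-merged",
    messages=[
        {"role": "user", "content": "Write a haiku about desert winds."},
    ],
    max_tokens=128,
)
print(resp.choices[0].message.content)
```

Note that the card lists the model as Gated and Cold, so the first request may incur a cold-start delay while the weights are loaded.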