Overview of Naija-Petro
Shinzmann/naija-petro is a 32-billion-parameter, domain-specific large language model built on the Qwen3-32B architecture and fine-tuned for petroleum engineering, where its specialized knowledge base sets it apart from general-purpose models.
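Because the model is instruction-tuned on a Qwen3 base, a request is typically framed as an instruction prompt. A minimal sketch of building such a prompt is shown below; the ChatML-style tag layout is an assumption based on the Qwen family's convention, and in practice the authoritative template ships with the model's tokenizer (`tokenizer.apply_chat_template`):

```python
# Sketch of framing a domain question as an instruction prompt.
# The <|im_start|>/<|im_end|> layout is a ChatML-style assumption
# (common to Qwen models); the tokenizer's own chat template is
# authoritative for the real model.
def build_prompt(
    instruction: str,
    system: str = "You are a petroleum engineering assistant.",
) -> str:
    return (
        f"<|im_start|>system\n{system}<|im_end|>\n"
        f"<|im_start|>user\n{instruction}<|im_end|>\n"
        f"<|im_start|>assistant\n"
    )

prompt = build_prompt(
    "Explain the purpose of a drill-stem test in reservoir evaluation."
)
print(prompt.count("<|im_start|>"))  # 3
```

The trailing open `assistant` turn leaves room for the model's generated answer, which is the usual shape for causal-LM chat prompting.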
Key Capabilities and Training
- Domain Specialization: The model is specifically trained to understand and generate content related to petroleum engineering, covering critical areas such as drilling, reservoir management, production, completions, enhanced oil recovery (EOR), and well testing.
- Instruction-Tuned: It was fine-tuned using 20,000 synthetic instruction-response pairs, generated with NVIDIA Data Designer, to enhance its ability to follow specific instructions within its domain.
- Efficient Fine-tuning: Training used QLoRA (4-bit quantization) with Unsloth, which roughly doubled training speed and cut VRAM consumption by about 70%.
- Training Parameters: Key training parameters included a learning rate of 0.0002 over 2 epochs, with approximately 19,000 training samples and 1,000 evaluation samples.
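The training setup above can be sketched in plain Python. The learning rate, epoch count, and 4-bit loading mirror the figures stated here; the seed, the split helper, and the LoRA rank are illustrative assumptions, not details reported for the actual run:

```python
import random

# Hyperparameters stated in the model card; everything else here
# (seed, split logic, LoRA rank) is an illustrative assumption.
CONFIG = {
    "base_model": "Qwen/Qwen3-32B",
    "learning_rate": 2e-4,   # 0.0002, as reported
    "num_epochs": 2,
    "load_in_4bit": True,    # QLoRA-style quantized fine-tuning
    "lora_rank": 16,         # assumption: a common default, not stated in the card
}

def split_dataset(pairs, eval_fraction=0.05, seed=42):
    """Shuffle instruction-response pairs and hold out an eval set.

    With 20,000 pairs and a 5% hold-out, this reproduces the reported
    ~19,000 training / 1,000 evaluation split.
    """
    rng = random.Random(seed)
    shuffled = list(pairs)
    rng.shuffle(shuffled)
    n_eval = int(len(shuffled) * eval_fraction)
    return shuffled[n_eval:], shuffled[:n_eval]

# Stand-in records shaped like the synthetic instruction-response pairs.
pairs = [{"instruction": f"q{i}", "response": f"a{i}"} for i in range(20_000)]
train_set, eval_set = split_dataset(pairs)
print(len(train_set), len(eval_set))  # 19000 1000
```

In an actual Unsloth run, `CONFIG` values like these would feed the trainer's arguments rather than live in a standalone dict; the dict form is just a compact way to show the reported settings.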
Considerations for Use
- Validation Required: Outputs from this model should always be validated by qualified engineers before being used in operational contexts due to its specialized and potentially critical application area.
- English Only: The model is designed for English-language interactions and is not intended for general conversational use outside its petroleum engineering domain.