nova:24b - Cybersecurity Domain LLM
pki/nova-24b-cybersec is a 24-billion-parameter language model developed by PKI and fine-tuned for cybersecurity applications. It is based on the Dolphin3.0-R1-Mistral-24B architecture and features a 32,768-token context window, allowing it to process lengthy security-related documents in a single pass.
Key Capabilities & Training
This model was trained on a specialized dataset of over 40,000 cybersecurity examples, including data from SecurityGPT, PKI Context, energy sector threats, and ISO 27001/27005 controls. This focused training targets the following security domains:
- Threat Modeling & Risk Assessment: Analyzing potential threats and evaluating risks.
- Incident Response: Assisting in the detection, analysis, and containment of security incidents.
- Vulnerability Management: Identifying and managing software and system vulnerabilities.
- Compliance: Understanding and applying standards like ISO 27001 and ISO 27005.
- Specialized Areas: Cryptography, adversarial machine learning, secure coding, and ICS/SCADA security.
Usage Considerations
For coherent output, it is critical to use a low temperature setting (0.05-0.1) during inference. The model is trained primarily on English text and requires significant VRAM: roughly 48 GB in 16-bit precision for the full 24B weights, less when quantized. It is available in both Hugging Face Transformers format and as a quantized GGUF for Ollama/llama.cpp.
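For the GGUF route, these recommendations can be captured in an Ollama Modelfile. This is a minimal sketch, not an official configuration: the GGUF filename below is an assumption (point it at the actual downloaded artifact), and the parameter values simply follow the guidance above.

```
# Hypothetical Modelfile for a GGUF quantization of pki/nova-24b-cybersec.
# The filename is illustrative; replace it with the actual quantized file.
FROM ./nova-24b-cybersec.Q4_K_M.gguf

# Low temperature (0.05-0.1) is required for coherent output.
PARAMETER temperature 0.05

# Use the model's full 32,768-token context window (reduce to save VRAM).
PARAMETER num_ctx 32768
```

Build and run with `ollama create nova-24b -f Modelfile`, then `ollama run nova-24b`.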