Overview
Abhi1907/KALI-V1, developed by DeepHat and Kindo.ai, is a 7.6-billion-parameter language model fine-tuned from Qwen2.5-Coder-7B. It belongs to the DeepHat series, which targets the cybersecurity domain, covering both offensive and defensive work, as well as DevOps tasks. The model inherits the transformer architecture of its base, including RoPE, SwiGLU, and RMSNorm.
Key Capabilities & Features
- Cybersecurity and DevOps Expertise: Specifically trained to excel in tasks related to cybersecurity and DevOps.
- Base Architecture: Built upon Qwen2.5-Coder-7B, featuring a causal language model architecture.
- Context Length: Supports a standard context window of 32,768 tokens.
- Long Text Processing: Can be configured to handle longer texts using the YaRN technique, extending its effective context beyond the default.
- Parameter Count: Comprises 7.61 billion parameters (6.53 billion non-embedding parameters).
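For the Qwen2.5 family, long-context support via YaRN is enabled by adding a rope_scaling entry to the model's config.json. The values below follow the base Qwen2.5 documentation (a factor of 4.0 extends the 32,768-token window to roughly 131,072 tokens); treat them as a starting point rather than a verified setting for this particular fine-tune:

```json
{
  "rope_scaling": {
    "factor": 4.0,
    "original_max_position_embeddings": 32768,
    "type": "yarn"
  }
}
```

Note that static YaRN scaling applies regardless of input length, so it is generally recommended only when long contexts are actually needed.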
Usage Considerations
- License: Operates under an Apache-2.0 license with a DeepHat Extended Version, which includes specific usage restrictions, particularly prohibiting military use and certain harmful applications.
- Responsible AI: Users are solely responsible for their use of the model and must adhere to the stated usage restrictions and terms of use; the developers disclaim liability for misuse.
- Integration: Requires the transformers library, version 4.37.0 or newer, for proper functionality.
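Qwen2.5-based models are prompted in the ChatML format. In practice, the tokenizer's apply_chat_template method builds this string for you, but as a self-contained illustration (the role names and message content below are made up for the example), the format can be sketched as:

```python
def build_chatml_prompt(messages):
    """Render a list of {role, content} dicts into ChatML text,
    the prompt format used by Qwen2.5-based models."""
    parts = []
    for m in messages:
        parts.append(f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>")
    # A trailing assistant header cues the model to generate its reply.
    parts.append("<|im_start|>assistant\n")
    return "\n".join(parts)

# Hypothetical example conversation for illustration only.
prompt = build_chatml_prompt([
    {"role": "system", "content": "You are a helpful cybersecurity assistant."},
    {"role": "user", "content": "Explain what SQL injection is."},
])
```

When the model is loaded with a recent transformers (4.37.0+), calling the tokenizer's apply_chat_template with the same message list produces this format automatically, so manual construction is rarely needed.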