WhiteRabbitNeo/Llama-3.1-WhiteRabbitNeo-2-8B
Text generation · Concurrency cost: 1 · Model size: 8B · Quant: FP8 · Context length: 32k · Published: Aug 19, 2024 · License: llama3.1 · Architecture: Transformer

WhiteRabbitNeo/Llama-3.1-WhiteRabbitNeo-2-8B is an 8-billion-parameter language model from WhiteRabbitNeo, built on the Llama-3.1 architecture with a 32,768-token (32k) context window. It is fine-tuned specifically for offensive and defensive cybersecurity work, with a focus on vulnerability identification and security analysis. It handles tasks such as identifying open ports, detecting outdated software, spotting misconfigurations, finding injection flaws, and recognizing known software vulnerabilities.
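Because the model inherits the Llama-3.1 architecture, it also inherits the standard Llama-3.1 chat prompt format. The sketch below hand-assembles a single-turn prompt with that format's special tokens; in practice you would let a tokenizer's `apply_chat_template` do this, and the system/user strings here are purely illustrative.

```python
def build_llama31_prompt(system: str, user: str) -> str:
    """Assemble a single-turn prompt using the standard Llama-3.1
    special tokens: <|begin_of_text|>, header markers, and <|eot_id|>."""
    return (
        "<|begin_of_text|>"
        "<|start_header_id|>system<|end_header_id|>\n\n"
        f"{system}<|eot_id|>"
        "<|start_header_id|>user<|end_header_id|>\n\n"
        f"{user}<|eot_id|>"
        # Trailing assistant header cues the model to generate its reply.
        "<|start_header_id|>assistant<|end_header_id|>\n\n"
    )

# Example (hypothetical) security-analysis query:
prompt = build_llama31_prompt(
    "You are a security analyst assistant.",
    "List common signs of an outdated OpenSSH server.",
)
```

The same prompt string can then be sent to any inference endpoint serving this model as a raw completion request.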