segolilylabs/Lily-Cybersecurity-7B-v0.2
Text Generation · Model Size: 7B · Quant: FP8 · Ctx Length: 4k · Published: Jan 11, 2024 · License: apache-2.0 · Architecture: Transformer · Concurrency Cost: 1 · Open Weights

Lily-Cybersecurity-7B-v0.2, by segolilylabs, is a 7-billion-parameter, Mistral-based, instruction-tuned model with a 4096-token context length. It was fine-tuned on roughly 22,000 hand-crafted cybersecurity and hacking-related instruction-response pairs, enhanced with additional LLM-generated context and personality. The model is designed to act as a helpful, friendly cybersecurity assistant that provides detailed explanations across a broad range of cybersecurity topics.
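Since the model is Mistral-based and instruction-tuned, prompts are typically wrapped in the Mistral `[INST]` chat template. The helper below is a minimal sketch of that formatting; the exact template is an assumption, so verify it against the model's tokenizer (e.g. `tokenizer.apply_chat_template`) before relying on it.

```python
def format_mistral_prompt(user_message: str, system_prompt: str = "") -> str:
    """Build a single-turn prompt in the Mistral-instruct style.

    NOTE: assumes the standard Mistral [INST] template; confirm against
    the model's own tokenizer chat template, which is authoritative.
    """
    content = f"{system_prompt}\n\n{user_message}" if system_prompt else user_message
    return f"<s>[INST] {content} [/INST]"


# Example: asking the cybersecurity assistant a question.
prompt = format_mistral_prompt("Explain how a SQL injection attack works.")
print(prompt)
```

The resulting string can be passed to any inference endpoint or local runtime serving the model; keep the total prompt plus generation within the 4096-token context window.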
