grizzfu/DeepHat-V1-7B
Text generation | Model size: 7.6B | Quant: FP8 | Context length: 32k | Concurrency cost: 1 | Published: Apr 3, 2026 | License: apache-2.0 | Architecture: Transformer | Open weights

DeepHat-V1-7B is a 7.61-billion-parameter causal language model developed by DeepHat, fine-tuned from Qwen2.5-Coder-7B. It is designed for offensive and defensive cybersecurity applications, building on the coding capabilities of its base model. The model supports a context length of 32,768 tokens, extensible to 131,072 tokens with YaRN, making it suitable for processing long cybersecurity documents and codebases.
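For context extension, the Qwen2.5 family documents enabling YaRN via a `rope_scaling` entry in the model's `config.json`; since DeepHat-V1-7B is fine-tuned from Qwen2.5-Coder-7B, the same mechanism should apply. A minimal sketch, assuming the base model's documented format (a factor of 4.0 over the native 32,768-token window gives the 131,072-token limit above; verify the exact field names against the published config):

```json
{
  "rope_scaling": {
    "type": "yarn",
    "factor": 4.0,
    "original_max_position_embeddings": 32768
  }
}
```

Note that Qwen's documentation recommends enabling this static scaling only when long inputs are actually needed, as it can slightly degrade performance on short texts.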
