georgewbabu/nova-v2-security
TEXT GENERATION · Open Weights · Cold

- Concurrency Cost: 1
- Model Size: 8B
- Quant: FP8
- Ctx Length: 32k
- Published: Mar 19, 2026
- License: apache-2.0
- Architecture: Transformer

georgewbabu/nova-v2-security is an 8-billion-parameter Qwen3 causal language model, developed by georgewbabu and fine-tuned from unsloth/qwen3-8b-unsloth-bnb-4bit. Training was accelerated using Unsloth and Hugging Face's TRL library. With a context length of 32,768 tokens, it can process long inputs efficiently across a range of language tasks. Its primary differentiator is its rapid training methodology, which makes it well suited to applications requiring quick iteration and deployment.
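As a rough illustration of how a model like this would typically be used, the sketch below loads it with the standard Hugging Face transformers API. This is an assumption based on the model being a standard causal LM with open weights, not an officially documented usage snippet; the generation parameters and the `truncate_to_context` helper are illustrative, and the 32,768-token budget comes from the context length stated above.

```python
# Minimal usage sketch (assumed, not from the model card): load
# georgewbabu/nova-v2-security as a standard causal LM via transformers.

MODEL_ID = "georgewbabu/nova-v2-security"
CTX_LEN = 32768  # context window stated on the model card


def truncate_to_context(token_ids, max_new_tokens=512):
    """Keep only the most recent tokens so prompt + generation fits in 32k."""
    budget = CTX_LEN - max_new_tokens
    return token_ids[-budget:]


def generate(prompt, max_new_tokens=512):
    # transformers is imported lazily so the module loads without the library.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto")

    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    output = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(output[0], skip_special_tokens=True)
```

Keeping the prompt inside the context budget before calling `generate` avoids silent truncation of the earliest tokens by the tokenizer or server.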
