WhiteRabbitNeo/WhiteRabbitNeo-2.5-Qwen-2.5-Coder-7B
Text generation · Concurrency cost: 1 · Model size: 7.6B · Quant: FP8 · Context length: 32k · Published: Sep 28, 2024 · License: apache-2.0 · Architecture: Transformer · Open weights

WhiteRabbitNeo/WhiteRabbitNeo-2.5-Qwen-2.5-Coder-7B is a 7.6-billion-parameter model from WhiteRabbitNeo, built on the Qwen 2.5 Coder architecture. This model series is designed and optimized for offensive and defensive cybersecurity applications, and it is also capable of generating functional, production-ready code, making it suitable for security-focused development tasks.


Popular Sampler Settings

The three most popular sampler-parameter combinations used by Featherless users for this model adjust the following parameters:

- temperature
- top_p
- top_k
- frequency_penalty
- presence_penalty
- repetition_penalty
- min_p
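As a sketch of how these sampler parameters are typically passed, the snippet below builds a chat-completion request body in the OpenAI-compatible style that Featherless uses. The sampler values shown are illustrative placeholders, not the actual popular configurations, and the endpoint path is an assumption.

```python
import json

# Illustrative request body for an OpenAI-compatible chat-completions
# endpoint. All sampler values below are example placeholders, not the
# real "popular" configs from the page above.
payload = {
    "model": "WhiteRabbitNeo/WhiteRabbitNeo-2.5-Qwen-2.5-Coder-7B",
    "messages": [
        {"role": "user", "content": "Review this function for injection flaws."}
    ],
    "temperature": 0.7,         # sampling randomness
    "top_p": 0.9,               # nucleus-sampling probability cutoff
    "top_k": 40,                # restrict sampling to the k most likely tokens
    "frequency_penalty": 0.0,   # penalize tokens by how often they appear
    "presence_penalty": 0.0,    # penalize tokens that have appeared at all
    "repetition_penalty": 1.1,  # discourage verbatim repetition
    "min_p": 0.05,              # drop tokens below this fraction of the top token's probability
}

# Serialize to the JSON you would POST to the chat-completions endpoint
# (assumed path: /v1/chat/completions).
body = json.dumps(payload)
print(body[:50])
```

The request itself would be sent with any HTTP client; only the shape of the payload matters here.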