luckycanucky/NeuralDaredevil-Toxic-32-64-2e
Text Generation · Model Size: 8B · Quant: FP8 · Ctx Length: 8k · Concurrency Cost: 1 · Published: Aug 14, 2025 · License: apache-2.0 · Architecture: Transformer · Open Weights
NeuralDaredevil-Toxic-32-64-2e is an 8-billion-parameter Llama model developed by luckycanucky, fine-tuned from mlabonne/NeuralDaredevil-8B-abliterated. It was trained twice as fast using Unsloth together with Hugging Face's TRL library. Its primary use case is general-purpose text generation and understanding, benefiting from this optimized training process.
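Usage follows the standard pattern for Llama-family checkpoints on the Hugging Face Hub. The snippet below is a minimal sketch using the Transformers `pipeline` API; the prompt and sampling settings are illustrative assumptions, not values published with the model, and loading the 8B weights requires substantial memory.

```python
# Minimal sketch: text generation with this model via Hugging Face
# Transformers. Sampling settings here are illustrative assumptions,
# not values published with the model.
import os

MODEL_ID = "luckycanucky/NeuralDaredevil-Toxic-32-64-2e"


def generate(prompt: str, max_new_tokens: int = 64) -> str:
    """Download the model (on first call) and generate a completion."""
    from transformers import pipeline  # imported lazily: heavy dependency

    pipe = pipeline("text-generation", model=MODEL_ID)
    out = pipe(prompt, max_new_tokens=max_new_tokens, do_sample=True)
    return out[0]["generated_text"]


# Guarded because fetching ~8B parameters of weights is expensive;
# set RUN_INFERENCE=1 to actually run generation.
if os.environ.get("RUN_INFERENCE"):
    print(generate("Explain the Llama architecture in one paragraph."))
```

For FP8 serving as listed in the metadata above, the model would typically be run through an inference server such as vLLM rather than the plain `pipeline` call shown here.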