darkc0de/UnderbossUncensored
darkc0de/UnderbossUncensored is a 10.7 billion parameter Llama-based language model developed by darkc0de, fine-tuned from Sao10K/Fimbulvetr-11B-v2. It was trained with Unsloth and Hugging Face's TRL library for faster training, and is intended for general language-generation tasks, with the Llama architecture giving it broad applicability.
Model Overview
darkc0de/UnderbossUncensored is a 10.7 billion parameter language model built on the Llama architecture and fine-tuned from Sao10K/Fimbulvetr-11B-v2, refining an already capable base model. Training used Unsloth together with Hugging Face's TRL library, which enabled roughly 2x faster training.
Key Characteristics
- Base Model: Fine-tuned from Sao10K/Fimbulvetr-11B-v2, inheriting its foundational strengths.
- Training Efficiency: Uses Unsloth for accelerated fine-tuning, reducing training time and compute requirements.
- Parameter Count: With 10.7 billion parameters, it offers a balance between performance and computational requirements.
- Context Length: Supports a context length of 4096 tokens, suitable for processing moderately long inputs.
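The 4096-token context limit means long inputs or running conversations must be truncated before inference. Below is a minimal sketch of tail-truncation in plain Python; the `estimate_tokens` heuristic (roughly 4 characters per token) is an assumption standing in for the model's real Llama tokenizer, which should be used in practice.

```python
# Sketch: keep a transcript within an assumed 4096-token context window.
# The 4-chars-per-token estimate is a placeholder, NOT the real tokenizer.
CONTEXT_TOKENS = 4096
CHARS_PER_TOKEN = 4  # rough heuristic; swap in the model's tokenizer for real use


def estimate_tokens(text: str) -> int:
    """Crude token estimate based on character count."""
    return max(1, len(text) // CHARS_PER_TOKEN)


def truncate_history(messages: list[str], budget: int = CONTEXT_TOKENS) -> list[str]:
    """Drop the oldest messages until the transcript fits the token budget."""
    kept: list[str] = []
    used = 0
    for msg in reversed(messages):  # walk newest-first
        cost = estimate_tokens(msg)
        if used + cost > budget:
            break
        kept.append(msg)
        used += cost
    return list(reversed(kept))  # restore chronological order
```

Keeping the newest messages (rather than the oldest) matches how chat front-ends typically window context: recent turns matter most for the next generation.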
Potential Use Cases
This model is suitable for a variety of general-purpose language tasks, including text generation, summarization, and question answering, benefiting from its Llama-based architecture and efficient fine-tuning.
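Instruction-style use of the model generally requires wrapping the user's request in a prompt template. The exact template this fine-tune was trained on is not documented here; the sketch below assumes an Alpaca-style format, which models in the Fimbulvetr/Solar lineage commonly use, so verify against the model's actual training format before relying on it.

```python
# Sketch: Alpaca-style prompt builder (assumed format; confirm against the
# model's actual training template before use).
def build_prompt(instruction: str, user_input: str = "") -> str:
    """Assemble an Alpaca-style instruction prompt for the model."""
    header = (
        "Below is an instruction that describes a task. "
        "Write a response that appropriately completes the request.\n\n"
    )
    prompt = header + f"### Instruction:\n{instruction}\n\n"
    if user_input:
        # Optional context block, e.g. a document to summarize.
        prompt += f"### Input:\n{user_input}\n\n"
    prompt += "### Response:\n"
    return prompt
```

The resulting string would then be tokenized and passed to the model's generate call; the model's completion follows the `### Response:` marker.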