georgesung/llama3_8b_chat_uncensored

TEXT GENERATION
Concurrency Cost: 1 · Model Size: 8B · Quant: FP8 · Ctx Length: 8K · Published: Apr 30, 2024 · License: other · Architecture: Transformer

georgesung/llama3_8b_chat_uncensored is a Llama-3 8B model fine-tuned by georgesung using QLoRA on an uncensored/unfiltered Wizard-Vicuna conversation dataset, producing an unfiltered conversational style. The model targets chat applications that require less restrictive content generation and offers an 8,192-token context length.


Model Overview

georgesung/llama3_8b_chat_uncensored is an 8 billion parameter language model based on Meta's Llama-3 8B architecture. It was fine-tuned by georgesung using QLoRA (Quantized Low-Rank Adaptation) to achieve an uncensored, unfiltered conversational style. Training used a Wizard-Vicuna conversation dataset, making the model suitable for applications where less restricted response generation is desired.

Key Capabilities

  • Uncensored Chat: Designed to provide unfiltered responses, differing from standard moderated models.
  • Llama-3 8B Base: Leverages the strong foundational capabilities of the Llama-3 8B model.
  • QLoRA Fine-tuning: Efficiently fine-tuned, making it accessible for various deployment scenarios.
  • GGUF Quantization: A 4-bit q4_0 GGUF quantized version is available for optimized local inference, compatible with tools like Ollama.
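Fine-tunes like this one are typically sensitive to the exact prompt template they were trained on. The model card for this fine-tune describes a `### HUMAN:` / `### RESPONSE:` turn format; the helper below is a minimal sketch of that template (verify the exact format against the Hugging Face repo before relying on it):

```python
def format_prompt(user_message: str) -> str:
    """Wrap a user message in the prompt template this fine-tune
    reportedly expects (check the model card for the exact format)."""
    return f"### HUMAN:\n{user_message}\n\n### RESPONSE:\n"

prompt = format_prompt("Hello, who are you?")
```

The trailing `### RESPONSE:\n` cues the model to generate the assistant turn; stopping generation when the model emits a new `### HUMAN:` marker is a common companion setting.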

Use Cases

  • Unrestricted Conversational AI: Ideal for chat applications that require responses without typical content filters.
  • Research and Development: Useful for exploring the behavior of LLMs with less content moderation.
  • Local Deployment: The provided GGUF version facilitates easy local deployment and experimentation using platforms like Ollama.
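For local deployment with Ollama, a custom model can be registered from the GGUF file via a Modelfile. The sketch below is a hypothetical example: the GGUF filename, template, and stop parameter are assumptions to be checked against the actual repo and model card:

```
# Hypothetical Ollama Modelfile for the q4_0 GGUF build.
# The filename and prompt template below are assumptions --
# verify both against the published GGUF and model card.
FROM ./llama3_8b_chat_uncensored-q4_0.gguf

TEMPLATE """### HUMAN:
{{ .Prompt }}

### RESPONSE:
"""

# Stop generating when the model starts a new human turn.
PARAMETER stop "### HUMAN:"
```

With a file like this saved as `Modelfile`, `ollama create llama3-uncensored -f Modelfile` registers the model and `ollama run llama3-uncensored` starts an interactive chat.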

Popular Sampler Settings

The most popular parameter combinations used by Featherless users for this model adjust the following sampler settings:

  • temperature
  • top_p
  • top_k
  • frequency_penalty
  • presence_penalty
  • repetition_penalty
  • min_p
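These sampler settings map onto the fields of a typical OpenAI-style completion request. The snippet below is an illustrative payload only: the values are placeholders, not recommended settings, and the exact endpoint and supported fields should be checked against the Featherless API documentation:

```python
# Illustrative completion-request payload; parameter names match the
# sampler settings listed above. Values are placeholders, not
# recommendations from this model's users.
payload = {
    "model": "georgesung/llama3_8b_chat_uncensored",
    "prompt": "### HUMAN:\nHello, who are you?\n\n### RESPONSE:\n",
    "max_tokens": 256,
    "temperature": 0.8,          # randomness of sampling
    "top_p": 0.95,               # nucleus sampling cutoff
    "top_k": 40,                 # restrict to k most likely tokens
    "frequency_penalty": 0.0,    # penalize frequent tokens
    "presence_penalty": 0.0,     # penalize already-seen tokens
    "repetition_penalty": 1.1,   # multiplicative repeat penalty
    "min_p": 0.05,               # minimum relative token probability
}
```

Such a payload would be POSTed as JSON to an OpenAI-compatible completions endpoint along with an API key.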