azale-ai/DukunLM-7B-V1.0-Uncensored

Text generation · Concurrency cost: 1 · Model size: 7B · Quant: FP8 · Context length: 4k · Published: Aug 13, 2023 · License: cc-by-nc-4.0 · Architecture: Transformer · Open weights

DukunLM-7B-V1.0-Uncensored by azale-ai is a 7 billion parameter language model fine-tuned specifically for Indonesian text generation. Built on the ehartford/WizardLM-7B-V1.0-Uncensored base model, it was trained on the Indonesian subset of the MBZUAI/Bactrian-X dataset using QLoRA for efficient fine-tuning. The model is designed for generating Indonesian-language content and produces uncensored output, with no filters or alignment applied.


DukunLM-7B-V1.0-Uncensored Overview

DukunLM-7B-V1.0-Uncensored is a 7 billion parameter language model developed by azale-ai, specifically engineered for Indonesian text generation. It is an updated, full model release, building upon the ehartford/WizardLM-7B-V1.0-Uncensored base model. The model was fine-tuned using the Indonesian subset of the MBZUAI/Bactrian-X dataset and the QLoRA method, employing an Alpaca-style prompt format.
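The Alpaca-style prompt format mentioned above can be sketched as follows. This uses the standard Alpaca template as an assumption; verify the exact wording against the model card, since the fine-tuning data may use a slightly different header.

```python
# Build an Alpaca-style prompt (standard Alpaca template; assumed here,
# check the model card for the exact format DukunLM expects).
def build_alpaca_prompt(instruction: str, input_text: str = "") -> str:
    if input_text:
        header = (
            "Below is an instruction that describes a task, paired with an "
            "input that provides further context. Write a response that "
            "appropriately completes the request.\n\n"
        )
        return (
            header
            + f"### Instruction:\n{instruction}\n\n"
            + f"### Input:\n{input_text}\n\n"
            + "### Response:\n"
        )
    header = (
        "Below is an instruction that describes a task. "
        "Write a response that appropriately completes the request.\n\n"
    )
    return header + f"### Instruction:\n{instruction}\n\n### Response:\n"


prompt = build_alpaca_prompt("Tuliskan sebuah pantun tentang laut.")
print(prompt)
```

The resulting string is what you would pass to the tokenizer or a text-generation pipeline; the model's completion follows the `### Response:` marker.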

Key Capabilities

  • Indonesian Text Generation: Optimized for generating content in the Indonesian language.
  • Uncensored Output: Provides responses without inherent filters or alignment, offering raw text generation capabilities.
  • Efficient Fine-tuning: Utilizes QLoRA for efficient adaptation from its English base model.

Limitations

  • Uncensored Nature: Lacks filters or alignment, which may lead to the generation of errors, cultural biases, or potentially offensive content.
  • Base Model Language: Inherits characteristics from its English base model and may retain English-centric behavior, despite the Indonesian fine-tuning.

Good For

  • Developers and researchers requiring a dedicated Indonesian language model.
  • Applications where uncensored text generation is specifically desired or managed externally.
  • Experimentation with language models focused on specific cultural and linguistic contexts within Indonesia.
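Because the model applies no filtering of its own, applications typically wrap generation with an external moderation step. A minimal sketch of such a wrapper follows; the blocklist and the `generate_fn` callable are hypothetical placeholders, and a production system would use a proper moderation model or service instead of keyword matching.

```python
# Minimal post-generation filter: wraps any text-generation callable and
# redacts outputs that match a blocklist. The blocklist and generate_fn
# are illustrative placeholders, not part of the model's API.
from typing import Callable, Iterable


def moderated_generate(
    generate_fn: Callable[[str], str],
    prompt: str,
    blocklist: Iterable[str],
    redaction: str = "[output withheld by moderation policy]",
) -> str:
    text = generate_fn(prompt)
    lowered = text.lower()
    if any(term.lower() in lowered for term in blocklist):
        return redaction
    return text


# Example with a stub standing in for the real model call.
stub = lambda p: "Ini adalah contoh keluaran model."
print(moderated_generate(stub, "Halo", blocklist=["kata_terlarang"]))
```

This keeps the uncensored model's raw output available internally while giving the application a single point of control over what reaches the user.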