Undi95/Unholy-v1-12L-13B

TEXT GENERATION · Concurrency Cost: 1 · Model Size: 13B · Quant: FP8 · Ctx Length: 4k · Published: Sep 10, 2023 · License: cc-by-nc-4.0 · Architecture: Transformer · Open Weights

Undi95/Unholy-v1-12L-13B is a 13-billion-parameter experimental language model developed by Undi95, built by merging uukuguy/speechless-llama2-luban-orca-platypus-13b and jondurbin/spicyboros-13b-2.2 and incorporating layers from Undi95/MLewd-L2-13B-v2-3. The model is explicitly engineered to be uncensored, aiming to bypass common content filters, and is designed for use cases requiring unrestricted language generation within a 4096-token context window.


Overview

Undi95/Unholy-v1-12L-13B is a highly experimental 13-billion-parameter language model from Undi95 whose primary focus is uncensored language generation. It was created by merging uukuguy/speechless-llama2-luban-orca-platypus-13b with jondurbin/spicyboros-13b-2.2 and integrating layers from Undi95/MLewd-L2-13B-v2-3, a combination intended to circumvent typical content-censorship mechanisms.
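For local experimentation, the model can be loaded like any other Llama-2-family checkpoint via the Hugging Face transformers library. The snippet below is a minimal sketch, assuming the weights are available under the repo id Undi95/Unholy-v1-12L-13B and that enough GPU memory (or CPU RAM) is available for a 13B model; dtype and quantization choices will depend on your hardware.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Undi95/Unholy-v1-12L-13B"  # repo id as listed above

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",   # keep the checkpoint's stored precision
    device_map="auto",    # requires the `accelerate` package
)

# For instruction-style prompting, use the Alpaca template described under
# Key Characteristics below; plain text continuation also works.
inputs = tokenizer(
    "The old lighthouse keeper climbed the stairs and", return_tensors="pt"
).to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128, do_sample=True, temperature=0.8)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

With a 4096-token context window, the prompt plus max_new_tokens should stay within that budget.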

Key Characteristics

  • Uncensored Output: Explicitly designed to bypass common content filters and generate unrestricted text.
  • Hybrid Architecture: Constructed by merging and re-layering components from multiple existing 13B models.
  • Experimental Nature: Labeled as "highly experimental" by its creator, indicating ongoing development and potential instability.
  • Alpaca Prompt Template: Utilizes the Alpaca instruction format for interaction (see the prompt sketch below).
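
The sketch below shows how a single-turn prompt is typically assembled under the Alpaca instruction format; the preamble wording is the conventional Alpaca one and is an assumption rather than a string quoted from this model's card.

```python
def build_alpaca_prompt(instruction: str) -> str:
    """Wrap a user instruction in the standard single-turn Alpaca template."""
    return (
        "Below is an instruction that describes a task. "
        "Write a response that appropriately completes the request.\n\n"
        "### Instruction:\n"
        f"{instruction}\n\n"
        "### Response:\n"
    )

# The model's reply is generated as the continuation after "### Response:".
print(build_alpaca_prompt("Summarize the plot of Dracula in two sentences."))
```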

Performance Benchmarks

Based on Open LLM Leaderboard evaluations, Unholy-v1-12L-13B achieves an average score of 50.65 across the leaderboard's benchmark suite. Notable individual scores include:

  • ARC (25-shot): 63.57
  • HellaSwag (10-shot): 83.75
  • MMLU (5-shot): 58.08
  • TruthfulQA (0-shot): 51.09

Use Cases

This model is intended for developers and researchers who require a language model capable of generating content without built-in censorship, particularly for exploring the boundaries of LLM behavior and content generation.