Mawdistical/Draconic-Tease-70B

70B · FP8 · 32768-token context · License: llama3.3

Overview

Draconic-Tease-70B: A Thematic Finetune

Draconic-Tease-70B is a 70 billion parameter language model developed by Mawdistical, finetuned from Steelskull's L3.3-Electra-R1-70b base model. It specializes in generating "furry"-themed content in an elegant, suggestive, draconic style and is intended for creative applications where that specific thematic output is desired.

Key Characteristics

  • Base Model: Built upon the robust L3.3-Electra-R1-70b architecture.
  • Thematic Finetune: Specifically trained to produce "furry" content with a draconic and suggestive tone.
  • Parameter Count: 70 billion parameters, supporting detailed and nuanced generation (a minimal loading sketch follows this list).
  • Context Length: Supports a context window of 32768 tokens.
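For reference, a minimal sketch of loading the model with the Hugging Face transformers library; the dtype and device-mapping choices are assumptions, and a 70B model generally requires multiple GPUs or a quantized/hosted deployment.

```python
# Minimal loading sketch (assumption: standard transformers usage; a 70B model
# needs multiple GPUs or heavy quantization to load locally).
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Mawdistical/Draconic-Tease-70B"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",   # keep the dtype stored in the checkpoint
    device_map="auto",    # shard layers across available GPUs (requires accelerate)
)
# The model card lists a 32768-token context window; prompts longer than that
# should be truncated or chunked by the caller.
```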

Recommended Usage

For optimal performance, the model card recommends the following sampler settings (a generation sketch follows this section):

  • Static Temperature: 1.0-1.05
  • Min P: 0.02
  • DRY Settings (optional): Multiplier 0.8, Base 1.75, Length 4

Users are also advised to use templates such as LLam@ception or LeCeption for enhanced reasoning and structured output, particularly LeCeption's XML-based stepped-thinking configuration.
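As a concrete illustration, the sketch below applies the recommended temperature and Min P with the transformers library; the prompt, max_new_tokens value, and loading details are illustrative assumptions, and the optional DRY parameters are omitted because plain transformers does not implement DRY sampling (a sampler backend that supports DRY is needed for that).

```python
# Generation sketch applying the recommended samplers (assumption: a recent
# transformers version with min_p support; the prompt is illustrative only).
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Mawdistical/Draconic-Tease-70B"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype="auto", device_map="auto"
)

messages = [{"role": "user", "content": "Describe a dragon's lair at dusk."}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output = model.generate(
    input_ids,
    max_new_tokens=512,
    do_sample=True,
    temperature=1.0,  # static temperature, 1.0-1.05 per the card
    min_p=0.02,       # Min P per the card
)
print(tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True))
```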

Good For

  • Generating creative narratives and roleplay scenarios with a "furry" and draconic theme.
  • Applications requiring a distinct, suggestive, and elegant writing style.
  • Users looking for a specialized model for niche thematic content generation.