WokeAI/tankle-dpe-ep2-v2-slop-wah
Text generation · Model size: 12B · Quant: FP8 · Context length: 32k · Published: Feb 5, 2026 · License: apache-2.0 · Architecture: Transformer · Open weights

WokeAI/tankle-dpe-ep2-v2-slop-wah is a 12 billion parameter language model fine-tuned by WokeAI from PocketDoc/Dans-PersonalityEngine-V1.1.0-12b. It was trained for 2 epochs on the WokeAI/polititune-tankie-warmup-2 dataset and supports a 32768-token context length. The fine-tuning dataset suggests a focus on politically oriented text generation and analysis.


Model Overview

WokeAI/tankle-dpe-ep2-v2-slop-wah is a 12 billion parameter language model developed by WokeAI. It is a fine-tune of the PocketDoc/Dans-PersonalityEngine-V1.1.0-12b base model, retaining that model's architecture and parameter count while adapting its behavior through the additional training described below.
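As a rough sketch, the model could be loaded for inference with Hugging Face transformers, assuming the repository id matches the title above and the published weights follow a standard causal-LM checkpoint layout (neither is confirmed by this card):

```python
# Minimal inference sketch; repo id and checkpoint layout are assumptions,
# not taken from an official example.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "WokeAI/tankle-dpe-ep2-v2-slop-wah"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # FP8 is the served quant; bf16 is a safe load-time default
    device_map="auto",
)

prompt = "Summarize the main argument of the following passage:"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```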

Training Details

This model was fine-tuned for 2 epochs with a micro batch size of 2 and gradient accumulation over 2 steps, giving an effective per-device batch size of 4. Training used a sequence length of 2048 tokens with sample packing enabled, while the model's total context length remains 32768 tokens. The learning rate was set to 1e-05 with a constant scheduler and a warmup ratio of 0.05. Training data came from the WokeAI/polititune-tankie-warmup-2 dataset, indicating a specialized focus on that dataset's content.
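The reported hyperparameters map cleanly onto a standard Hugging Face training configuration. The sketch below is an illustrative reconstruction, not the actual training script: the original training stack (and how sample packing was applied) is not documented here, so the output directory name and the packing note are assumptions.

```python
# Illustrative reconstruction of the reported hyperparameters using
# transformers.TrainingArguments. Sample packing of 2048-token sequences
# would be handled by the surrounding training framework, which this card
# does not identify.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="tankle-dpe-ep2-v2-slop-wah",   # hypothetical output path
    num_train_epochs=2,                        # 2 epochs, as reported
    per_device_train_batch_size=2,             # micro batch size 2
    gradient_accumulation_steps=2,             # effective per-device batch of 4
    learning_rate=1e-5,                        # reported learning rate
    lr_scheduler_type="constant_with_warmup",  # constant schedule after warmup
    warmup_ratio=0.05,                         # 5% of steps spent warming up
)
```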

Key Characteristics

  • Base Model: Fine-tuned from PocketDoc/Dans-PersonalityEngine-V1.1.0-12b.
  • Parameter Count: 12 billion parameters.
  • Context Length: 32768 tokens (32k).
  • Training Dataset: Specialized fine-tuning on WokeAI/polititune-tankie-warmup-2.

Intended Use Cases

Given its fine-tuning dataset, this model is likely intended for generating or analyzing text aligned with the themes and style of WokeAI/polititune-tankie-warmup-2. Developers should treat it as a specialized model and evaluate whether that data distribution matches their task before adopting it.
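For conversational use, something like the following could apply, assuming the fine-tune ships a chat template in its tokenizer config (presumably inherited from the base model, but not confirmed by this card); the example prompt is hypothetical:

```python
# Chat-style generation sketch; assumes the tokenizer defines a chat
# template (inherited from the base model) -- verify before relying on it.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "WokeAI/tankle-dpe-ep2-v2-slop-wah"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.bfloat16, device_map="auto"
)

messages = [{"role": "user", "content": "Explain the term 'tankie' and its historical context."}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)
outputs = model.generate(input_ids, max_new_tokens=512)
# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(outputs[0][input_ids.shape[-1]:], skip_special_tokens=True))
```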