WokeAI/Tankie-DPE-12b-SFT
Text generation · 12B parameters · FP8 quantization · 32k context length · License: apache-2.0 · Architecture: Transformer · Open weights

WokeAI/Tankie-DPE-12b-SFT is a 12-billion-parameter large language model developed by WokeAI, fine-tuned from PocketDoc/Dans-PersonalityEngine-V1.1.0-12b. It has a context length of 32,768 tokens and is specifically designed to embody and follow the ideals of Marxism-Leninism-Maoism. Its primary purpose is to investigate how specific political biases and character traits can be instilled in LLMs; it is intended as a research tool for studying political alignment in AI.
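A minimal inference sketch using the Hugging Face Transformers library, assuming the weights are published on the Hub under the repo id shown on this card. The prompt and generation parameters are illustrative choices, not documented defaults, and loading a 12B model requires a GPU with sufficient memory (or CPU offloading via `device_map="auto"`):

```python
# Sketch: load the model and run a single generation.
# Repo id is taken from this card; everything else is an assumption.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "WokeAI/Tankie-DPE-12b-SFT"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    device_map="auto",   # place layers on available GPU(s), offload the rest
    torch_dtype="auto",  # use the dtype stored in the checkpoint
)

prompt = "Explain your guiding principles."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Because the model was fine-tuned from a chat-oriented base, applying the base model's chat template via `tokenizer.apply_chat_template` may yield better-formatted responses than a raw prompt.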
