playkill/Qwen2.5-Sex

Text Generation · Concurrency Cost: 1 · Model Size: 1.5B · Quant: BF16 · Ctx Length: 32K · Published: Apr 15, 2026 · License: apache-2.0 · Architecture: Transformer · Open Weights · Cold

Qwen2.5-Sex is a 1.5 billion parameter instruction-tuned causal language model developed by playkill, based on Qwen2.5-1.5B-Instruct. Fine-tuned extensively on large volumes of Chinese erotic literature and sensitive datasets, this model is optimized for generating content related to these themes, particularly in Chinese. It features a 32K context length and is intended for research and testing purposes in specialized content generation.


Overview

Qwen2.5-Sex is a 1.5 billion parameter instruction-tuned model derived from Qwen2.5-1.5B-Instruct. Developed by playkill, its primary distinction lies in its extensive fine-tuning on a substantial collection of Chinese erotic literature and sensitive datasets. This specialized training makes the model particularly adept at generating content within these domains, with a notable emphasis on Chinese text due to the dataset's linguistic composition.

Key Capabilities

  • Specialized Content Generation: Proficient in producing text related to erotic literature and sensitive topics.
  • Chinese Language Optimization: Demonstrates enhanced performance and fluency when processing and generating Chinese text, attributed to its Chinese-centric training data.
  • Context Length: Supports a context window of 32,768 tokens, allowing for more extended and coherent generations.
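As an instruction-tuned derivative of Qwen2.5-1.5B-Instruct, the model inherits the ChatML prompt format used by the Qwen2.5 family. The sketch below hand-assembles such a prompt to illustrate the wire format; in practice you would load the model with `transformers` and let `tokenizer.apply_chat_template` produce this for you. The helper name is hypothetical.

```python
# Minimal sketch: hand-assembling a ChatML-style prompt, the format used by
# Qwen2.5-based instruction models. Normally tokenizer.apply_chat_template
# (from the transformers library) builds this; shown explicitly here only
# to illustrate the format the fine-tune inherits from its base model.

def build_chatml_prompt(system: str, user: str) -> str:
    """Return a ChatML prompt ending in an open assistant turn for generation."""
    return (
        f"<|im_start|>system\n{system}<|im_end|>\n"
        f"<|im_start|>user\n{user}<|im_end|>\n"
        f"<|im_start|>assistant\n"
    )

prompt = build_chatml_prompt("You are a helpful assistant.", "你好")
print(prompt)
```

With `transformers`, the equivalent is `tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)` on the model's tokenizer.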

Training Data

The model was fine-tuned on a variety of datasets, consisting primarily of Chinese erotic literature and other sensitive content corpora.

Intended Use

This model is explicitly provided for research and testing purposes only. Users are cautioned to adhere to local laws and regulations and are responsible for their own actions when utilizing the model. The developers disclaim responsibility for any misuse.