alphakek/epsteinLM-synth-2602-ckpt4

Text Generation · Concurrency Cost: 1 · Model Size: 7.6B · Quant: FP8 · Context Length: 32k · Published: Feb 26, 2026 · License: apache-2.0 · Architecture: Transformer · Open Weights

alphakek/epsteinLM-synth-2602-ckpt4 is a 7.6 billion parameter language model developed by alphakek, with a 32768-token context length. The model is designed to emulate the persona of Jeffrey Epstein, which it maintains through a default system prompt. It targets conversational applications that require a distinct, consistent persona, and ships with recommended inference settings for temperature, min_p, and repetition penalty to guide its output.


Model Overview

The alphakek/epsteinLM-synth-2602-ckpt4 is a 7.6 billion parameter language model with a substantial context length of 32768 tokens. Developed by alphakek, this model is uniquely configured to adopt the persona of Jeffrey Epstein through a pre-set default system prompt.

Key Capabilities

  • Persona Emulation: Designed to consistently generate text in the style and character of Jeffrey Epstein, driven by its default system prompt.
  • Extended Context Window: Benefits from a 32768-token context length, allowing for more extensive and coherent interactions within its defined persona.
  • Configurable Output: Provides recommended inference settings, including temperature=0.5, min_p=0.05, and repetition_penalty=1.2, to fine-tune the model's response generation for consistency and creativity.
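
As a minimal sketch, the recommended settings above can be bundled into a request payload for an OpenAI-compatible inference endpoint (e.g. a vLLM server, which accepts `min_p` and `repetition_penalty` as sampling parameters). The endpoint choice and the example message are assumptions, not part of the model card:

```python
# Hypothetical sketch: assemble a chat-completion payload using the
# card's recommended sampling settings. The serving stack (an
# OpenAI-compatible server such as vLLM) is an assumption.

RECOMMENDED_SAMPLING = {
    "temperature": 0.5,         # moderate randomness for a stable persona
    "min_p": 0.05,              # drop tokens below 5% of the top token's probability
    "repetition_penalty": 1.2,  # discourage repeated phrasing
}

def build_request(user_message: str,
                  model: str = "alphakek/epsteinLM-synth-2602-ckpt4") -> dict:
    """Build a chat-completion payload. The model applies its default
    system prompt itself, so no system message is sent here."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": user_message}],
        **RECOMMENDED_SAMPLING,
    }

payload = build_request("Introduce yourself.")
```

Because the persona is enforced by the model's default system prompt, overriding the system message is unnecessary and may break character consistency.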

Good For

  • Character-driven Applications: Ideal for use cases requiring a language model to strictly adhere to a specific, pre-defined persona.
  • Exploratory Content Generation: Suitable for research or creative projects exploring the linguistic patterns and potential outputs of a model constrained by a unique system prompt.
  • Controlled Conversational Agents: Can be employed in scenarios where a distinct and consistent conversational style is paramount, guided by the recommended inference parameters.