TheDrummer/Rocinante-X-12B-v1

Hugging Face
Text Generation · Concurrency Cost: 1 · Model Size: 12B · Quant: FP8 · Context Length: 32k · Published: Jan 19, 2026 · Architecture: Transformer

Rocinante-X-12B-v1 is a 12 billion parameter language model developed by TheDrummer, specifically fine-tuned for creative writing, dynamic storytelling, and robust roleplaying with a 32768 token context length. Unlike models optimized for intelligence or correctness, Rocinante-X-12B-v1 prioritizes imagination and engaging dialogue, aiming to provide a superior experience for entertainment and creative use cases. It is noted for feeling larger than its 12B size, offering depth and creativity without sacrificing coherence in complex scenarios.


Rocinante-X-12B-v1: A Creative Powerhouse

TheDrummer's Rocinante-X-12B-v1 is a 12 billion parameter model designed with a strong emphasis on creativity, usability, and entertainment. Building on the legacy of its predecessor, this model is specifically tuned for enhanced prose, engaging dialogue, and robust roleplaying experiences, distinguishing itself from models primarily focused on intelligence or problem-solving.

Key Capabilities

  • Exceptional Creativity: Excels in writing, dynamic storytelling, and imaginative narrative generation.
  • Robust Roleplaying: Optimized for immersive and consistent character adherence in roleplay scenarios.
  • Flexible (Dis)alignment: Designed to avoid corporate/religious/political biases and forced positivity, allowing for exploration of diverse themes.
  • High Adherence: Demonstrates strong instruction following and prompt understanding.
  • Extended Context: Supports a substantial 32768 token context length, enabling longer and more complex interactions.
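To make practical use of the 32,768-token window in long roleplay sessions, older turns eventually need to be dropped. Below is a minimal sketch of history trimming; it estimates tokens with a crude characters-divided-by-four heuristic rather than the model's real tokenizer, so treat the numbers as approximations and swap in the actual tokenizer for anything serious.

```python
# Rough sketch: trim a chat history to fit a 32,768-token context window.
# The chars/4 token estimate is a heuristic, NOT the model's tokenizer.

CTX_LEN = 32_768   # Rocinante-X-12B-v1 context length
RESERVE = 1_024    # tokens reserved for the model's reply

def approx_tokens(text: str) -> int:
    """Very rough token estimate (~4 characters per token for English)."""
    return max(1, len(text) // 4)

def trim_history(messages: list[dict], budget: int = CTX_LEN - RESERVE) -> list[dict]:
    """Keep the system prompt plus the most recent turns that fit the budget."""
    system = [m for m in messages if m["role"] == "system"]
    turns = [m for m in messages if m["role"] != "system"]
    used = sum(approx_tokens(m["content"]) for m in system)
    kept: list[dict] = []
    for msg in reversed(turns):  # walk from the newest turn backwards
        cost = approx_tokens(msg["content"])
        if used + cost > budget:
            break
        kept.append(msg)
        used += cost
    return system + list(reversed(kept))
```

Keeping the system prompt pinned while discarding the oldest turns first is a common way to preserve character instructions as a conversation grows.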

Good for

  • Creative Writing: Ideal for generating compelling stories, scripts, and descriptive prose.
  • Roleplaying & Entertainment: Provides a superior experience for interactive fiction, character-driven narratives, and unaligned conversational agents.
  • Users Seeking Unrestricted Expression: Suitable for applications where models need to navigate complex or dubious themes without moralizing.
  • Resource-Efficient Depth: Offers performance and creative depth often associated with larger models, making it a strong contender in the 12B parameter bracket.
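For readers who want to try the model programmatically, here is an illustrative sketch of a chat-completions request body for a host that exposes an OpenAI-compatible API. The model id string, sampling values, and parameter support (e.g. `min_p`) are assumptions for illustration; check your provider's documentation for the real names and recommended settings.

```python
# Illustrative sketch only: the model id and sampling values below are
# assumptions, not provider-confirmed settings.
import json

def build_request(messages: list[dict]) -> dict:
    """Assemble a chat-completions body tuned toward creative, varied prose."""
    return {
        "model": "TheDrummer/Rocinante-X-12B-v1",  # repo-style id; hosts may differ
        "messages": messages,
        "max_tokens": 1024,   # leave headroom inside the 32k context
        "temperature": 1.0,   # higher temperature favors imaginative output
        "min_p": 0.05,        # min-p sampling, if the backend supports it
        "stream": True,
    }

body = build_request([
    {"role": "system", "content": "You are a vivid, character-driven storyteller."},
    {"role": "user", "content": "Open a scene in a rain-soaked harbor town."},
])
payload = json.dumps(body)  # ready to POST to the host's chat endpoint
```

Relatively loose sampling (higher temperature with a small min-p floor) is a common starting point for creative-writing models, in contrast to the near-greedy settings used for factual tasks.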