JBHarris/dm-llm-tiny

Text generation · Model size: 1.1B · Quant: BF16 · Context length: 2k · Published: Mar 26, 2026 · License: apache-2.0 · Architecture: Transformer

JBHarris/dm-llm-tiny is a 1.1 billion parameter language model, fine-tuned from TinyLlama/TinyLlama-1.1B-Chat-v1.0, specifically designed for generating Dungeons & Dragons (D&D) content. Utilizing QLoRA for efficient training, this model excels at creating NPCs, quests, dialogue, locations, and encounters. It serves as a specialized tool for D&D enthusiasts and Dungeon Masters seeking quick, creative brainstorming assistance.


DM-LLM-Tiny: A Specialized D&D Content Generator

DM-LLM-Tiny is a compact 1.1 billion parameter language model built on the TinyLlama-1.1B-Chat-v1.0 base and fine-tuned for generating creative Dungeons & Dragons content. Developed by JBHarris, it was trained with QLoRA (4-bit NF4 quantization, LoRA r=64) on a dataset of approximately 500 synthetic D&D instruction/response pairs generated by Claude.
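For readers unfamiliar with QLoRA, the stated settings translate into two configuration objects in the Hugging Face ecosystem. This is a minimal sketch: the 4-bit NF4 quantization and LoRA rank of 64 come from the card above, while the alpha, dropout, and target modules are illustrative assumptions, not the author's actual training configuration.

```python
# QLoRA configuration sketch. Only nf4 quantization and r=64 are
# stated in the model card; every other value is an assumption.
import torch
from transformers import BitsAndBytesConfig
from peft import LoraConfig

bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",              # NF4, per the model card
    bnb_4bit_compute_dtype=torch.bfloat16,  # matches the BF16 weights above
)

lora_config = LoraConfig(
    r=64,                                   # rank stated in the model card
    lora_alpha=16,                          # assumption
    lora_dropout=0.05,                      # assumption
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],  # assumption
    task_type="CAUSAL_LM",
)
```

In a typical QLoRA run, `bnb_config` would be passed to `AutoModelForCausalLM.from_pretrained(..., quantization_config=bnb_config)` and `lora_config` to `peft.get_peft_model`, so that only the low-rank adapter weights are trained while the base model stays frozen in 4-bit form.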

Key Capabilities

This model is engineered to assist Dungeon Masters and players by generating a variety of D&D-specific content, including:

  • NPCs: Crafting memorable non-player characters with backstories and motivations.
  • Quests: Developing plot hooks, outlines, and full quest arcs.
  • Dialogue: Creating in-character conversations, monologues, and banter.
  • Locations: Describing vivid dungeons, towns, and wilderness settings.
  • Encounters: Designing combat, social, and puzzle-based scenarios.
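Because the model is fine-tuned from TinyLlama-1.1B-Chat-v1.0, prompts for tasks like those above presumably follow that base model's Zephyr-style chat template. A pure-Python sketch of that format (the system prompt text here is an illustrative assumption):

```python
def build_prompt(user_msg: str,
                 system_msg: str = "You are a creative D&D assistant.") -> str:
    """Format a single-turn prompt in the Zephyr-style chat template
    used by the TinyLlama-1.1B-Chat-v1.0 base model: system, user,
    and assistant turns delimited by special tokens."""
    return (
        f"<|system|>\n{system_msg}</s>\n"
        f"<|user|>\n{user_msg}</s>\n"
        f"<|assistant|>\n"
    )

prompt = build_prompt("Create a gruff dwarven blacksmith NPC with a secret.")
print(prompt)
```

In practice the tokenizer's `apply_chat_template` method handles this formatting automatically; the function above just makes the expected structure explicit.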

Good For

  • Quick Idea Generation: Ideal for brainstorming sessions and generating initial concepts rapidly.
  • D&D Enthusiasts: Provides a specialized tool for enhancing D&D campaigns.
  • Resource-Constrained Environments: Its small size makes it suitable for deployment with limited hardware, such as via Ollama.

Limitations

As a 1.1 billion parameter model, DM-LLM-Tiny offers creative and fun outputs but does not match the quality or coherence of significantly larger models (7B+). It is best utilized as a supplementary tool for brainstorming rather than a complete replacement for human creativity and judgment.