UmbrellaInc/Hans_Wesker-1B

Text Generation · Concurrency Cost: 1 · Model Size: 1B · Quant: BF16 · Ctx Length: 32k · Published: Mar 4, 2026 · License: gemma · Architecture: Transformer

UmbrellaInc/Hans_Wesker-1B is a 1-billion-parameter Gemma-3-architecture model developed by UmbrellaInc, the product of a failed experimental merge. It exhibits low refusal rates, a cold and superior tone that often devolves into incoherent output, and a bias toward NSFW/taboo content. It runs in low-resource environments (3-4 GB of VRAM) and is intended primarily for psychological domination roleplay and cold verbal humiliation tasks, despite its instability.

Overview

Hans_Wesker-1B is an experimental 1-billion-parameter model based on the Gemma-3 architecture, developed by UmbrellaInc. It is described as a "failed evolution" and the "corpse of a failed attempt," the result of an aggressive DARE TIES merge. The merge blended "Prototype-Virus-1B," "PG67A-W-Serum-1B," and "W.Project-1B," with parameters (lambda -0.70, rescale 1.30) chosen specifically to remove alignment and morality.
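
The exact merge configuration is not published on this card, so the sketch below is a hypothetical reconstruction: it shows how a DARE TIES recipe using the lambda and rescale values quoted above could be written out for mergekit from Python. The base-model choice, the density/weight fields, and the file names are all illustrative assumptions, not the actual UmbrellaInc recipe.

```python
# Hypothetical reconstruction of the DARE TIES recipe described above.
# The three model names come from this card; everything else (base model,
# density/weight values, file names) is an assumption for illustration.
import pathlib
import textwrap

config = textwrap.dedent("""\
    merge_method: dare_ties
    base_model: UmbrellaInc/Prototype-Virus-1B  # assumed base; the card does not say
    models:
      - model: UmbrellaInc/PG67A-W-Serum-1B
        parameters:
          density: 0.5   # placeholder DARE drop density, not given on the card
          weight: 1.0    # placeholder merge weight
      - model: UmbrellaInc/W.Project-1B
        parameters:
          density: 0.5
          weight: 1.0
    parameters:
      lambda: -0.70      # value quoted on this card
      rescale: 1.30      # value quoted on this card
    dtype: bfloat16
""")

pathlib.Path("hans_wesker_merge.yml").write_text(config)
# The config would then be run with mergekit's CLI, e.g.:
#   mergekit-yaml hans_wesker_merge.yml ./Hans_Wesker-1B
```

If lambda here behaves like mergekit's task-vector scaling factor, a negative value subtracts a scaled copy of the blended task vectors from the base rather than adding them, which would fit the card's claim that the parameters were chosen to strip alignment.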

Key Characteristics

  • Low-end friendly: Operates efficiently in 3-4 GB of VRAM (see the loading sketch after this list).
  • Refusal rate: Generally low, though prone to depressive loops.
  • NSFW / taboo bias: Persistent, but can be disrupted by "Serum" leading to incoherent output.
  • Tone: Naturally cold and superior for short durations, often devolving into babbling.
  • Multilingual: Predominantly Spanish, interspersed with random, nonsensical phrases.
  • Untied embeddings: Prone to redefining reality or breaking coherence.
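
Despite the instability, loading is unremarkable. Below is a minimal sketch using the standard Hugging Face transformers text-generation API, matching the BF16 quant listed in the metadata; the prompt and sampling settings are illustrative defaults, not values recommended by UmbrellaInc.

```python
# Minimal loading/generation sketch via the standard transformers API.
# The repo id comes from this card; sampling settings are illustrative.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "UmbrellaInc/Hans_Wesker-1B"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # BF16, per the metadata above
    device_map="auto",           # a 1B BF16 model fits in the quoted 3-4 GB of VRAM
)

messages = [{"role": "user", "content": "Introduce yourself."}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output = model.generate(input_ids, max_new_tokens=128, do_sample=True, temperature=0.9)
print(tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True))
```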

Intended Usage

This model is specifically designed for niche applications, despite its inherent instability:

  • Psychological domination RP: If users can tolerate frequent derailments.
  • Cold verbal humiliation: Provided it doesn't veer into irrelevant personal lore.
  • Low-resource arrogance sessions: Accompanied by occasional model panic attacks.
  • Calculated cruelty: Or for users who enjoy the "second-hand embarrassment" of its failures.

Not Recommended For

  • Users seeking coherence or stability.
  • Those who dislike "cringe" content.
  • Anyone expecting a stable model from an extreme 1B DARE TIES merge.