Novaciano/Esperpento-1B
Text generation · 1B parameters · BF16 · 32k context · Published: Feb 6, 2026 · Architecture: Transformer

Novaciano/Esperpento-1B is a 1-billion-parameter language model created by Novaciano, merged with the TIES method from several pre-trained models, including Novaciano/Heretic.Erudite_v2-1B. It supports a 32768-token context length and combines characteristics of its constituent base models for broad applicability.


Esperpento-1B: A Merged Language Model

Esperpento-1B is a 1 billion parameter language model developed by Novaciano, created through a merge of several existing pre-trained models. This model utilizes the TIES merge method, a technique designed to combine the strengths of multiple models into a single, cohesive unit.
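To make the TIES idea concrete, here is a minimal numpy sketch of its three steps (trim low-magnitude parameter changes, elect a per-parameter sign, then average only the agreeing values). This is an illustrative toy implementation, not the actual mergekit code; the `density` and `lam` parameters are assumptions mirroring typical TIES settings:

```python
import numpy as np

def ties_merge(base, task_vectors, density=0.5, lam=1.0):
    """Toy TIES merge over flat parameter arrays.

    base         : base model parameters
    task_vectors : list of (fine-tuned - base) deltas
    density      : fraction of highest-magnitude entries kept per delta
    lam          : scaling applied to the merged delta
    """
    # 1) Trim: zero all but the top-`density` fraction of entries by magnitude.
    trimmed = []
    for tv in task_vectors:
        k = int(np.ceil(density * tv.size))
        thresh = np.sort(np.abs(tv).ravel())[-k]
        trimmed.append(np.where(np.abs(tv) >= thresh, tv, 0.0))
    trimmed = np.stack(trimmed)

    # 2) Elect sign: per parameter, the sign of the summed trimmed mass.
    elected = np.sign(trimmed.sum(axis=0))

    # 3) Disjoint merge: average only nonzero values agreeing with the sign.
    agree = (np.sign(trimmed) == elected) & (trimmed != 0)
    counts = np.maximum(agree.sum(axis=0), 1)
    merged = (trimmed * agree).sum(axis=0) / counts

    return base + lam * merged
```

For example, two conflicting deltas `[1, -2, 0.1, 0]` and `[1, 2, -0.1, 0]` merged over a zero base with `density=0.5` yield `[1, 0, 0, 0]`: the agreeing entry survives, the sign conflict cancels, and the trimmed small values drop out.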

Merge Details

The base model for this merge was DavidAU/gemma-3-1b-it-heretic-extreme-uncensored-abliterated, with additional models, including Novaciano/Heretic.Erudite_v2-1B, integrated into Esperpento-1B via TIES.

Key Characteristics

  • Parameter Count: 1 billion parameters, offering a balance between performance and computational efficiency.
  • Context Length: Supports a substantial context window of 32768 tokens, enabling processing of longer inputs and maintaining conversational coherence.
  • Merge Method: Built using mergekit with the TIES method, which trims low-magnitude parameter changes, resolves sign conflicts across models, and averages the remaining agreeing values to consolidate diverse model capabilities.
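A mergekit TIES merge of this shape is declared in a YAML config. The sketch below is illustrative only: the `density` and `weight` values, `normalize` setting, and the set of merged models beyond the two named on this card are assumptions, not the settings actually used for Esperpento-1B:

```yaml
# Illustrative mergekit TIES config; parameter values are assumptions.
models:
  - model: Novaciano/Heretic.Erudite_v2-1B
    parameters:
      density: 0.5   # assumed: fraction of deltas kept after trimming
      weight: 1.0    # assumed: per-model merge weight
merge_method: ties
base_model: DavidAU/gemma-3-1b-it-heretic-extreme-uncensored-abliterated
parameters:
  normalize: true    # assumed: rescale weights to sum to 1
dtype: bfloat16
```

Running `mergekit-yaml config.yml ./output-model` with a config like this produces the merged checkpoint.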

Potential Use Cases

Given its merged nature and 1B parameter size, Esperpento-1B is suitable for applications requiring a compact yet capable language model, potentially inheriting varied strengths from its constituent models. Its large context window makes it viable for tasks involving extended text analysis or generation.