Naphula/Goetia-24B-v1.3

Parameters: 24B
Weights: FP8
Context length: 32768 tokens
Released: Feb 3, 2026
License: apache-2.0
Overview

Goetia 24B v1.3: A DELLA-Merged Model

Goetia 24B v1.3, developed by Naphula, is a 24-billion-parameter language model built on the Mistral architecture. This "Della Edition" merges a large number of pre-trained models and is specifically engineered to bridge performance gaps between the 2501 and 2503 model families. It uses the DELLA merge method to combine the diverse strengths of its constituent models, aiming for robust and versatile output.
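DELLA-style merging, roughly, computes each fine-tune's delta from a shared base, stochastically drops low-magnitude entries (keep probability scaled by magnitude), rescales the survivors, elects a majority sign per parameter, and averages the agreeing deltas back onto the base. The toy NumPy sketch below illustrates that idea on flat vectors; it is a simplification for intuition, not the actual mergekit implementation used to build this model, and all function and parameter names are illustrative.

```python
import numpy as np

def della_merge(base, finetuned, density=0.5, epsilon=0.1, seed=0):
    """Toy sketch of DELLA-style merging on flat parameter vectors.

    Simplified steps: compute task deltas, stochastically drop
    low-magnitude entries (keep probability scaled by magnitude rank),
    rescale survivors, elect a majority sign per parameter, and
    average the agreeing deltas onto the base weights.
    """
    rng = np.random.default_rng(seed)
    deltas = [ft - base for ft in finetuned]
    pruned = []
    for d in deltas:
        # Rank entries by magnitude, normalized to [0, 1].
        ranks = np.abs(d).argsort().argsort() / max(len(d) - 1, 1)
        # Keep probability varies around `density` by +/- epsilon/2.
        keep_p = np.clip(density - epsilon / 2 + epsilon * ranks, 0.0, 1.0)
        mask = rng.random(len(d)) < keep_p
        # Rescale survivors so the expected delta is preserved.
        pruned.append(np.where(mask, d / np.maximum(keep_p, 1e-8), 0.0))
    stacked = np.stack(pruned)
    sign = np.sign(stacked.sum(axis=0))            # elected sign per parameter
    agree = (np.sign(stacked) == sign) & (stacked != 0)
    counts = np.maximum(agree.sum(axis=0), 1)
    merged_delta = (stacked * agree).sum(axis=0) / counts
    return base + merged_delta
```

In practice this is applied per weight tensor across dozens of constituent models; the drop-and-rescale step is what lets many models contribute without their deltas interfering destructively.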

Key Capabilities

  • Advanced Merging: Utilizes the DELLA method to integrate capabilities from over 30 distinct base models.
  • Broad Context: Supports a context length of 32768 tokens, enabling processing of extensive inputs.
  • Experimental Design: Represents an effort to explore novel merging strategies for improved model performance and versatility.
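A DELLA merge of this kind is typically described by a mergekit configuration. The fragment below is a hypothetical sketch of what such a config might look like; the base model, constituent model names, and parameter values are illustrative assumptions, not taken from this card (the actual merge combined over 30 models).

```yaml
merge_method: della
base_model: example/mistral-24b-base      # illustrative placeholder
models:
  - model: example/finetune-a             # placeholder constituent model
    parameters:
      weight: 0.5                         # relative contribution
      density: 0.5                        # fraction of delta entries kept
      epsilon: 0.05                       # magnitude-based keep-probability spread
  - model: example/finetune-b
    parameters:
      weight: 0.5
      density: 0.5
      epsilon: 0.05
dtype: bfloat16
```

Each listed model contributes a pruned, rescaled delta relative to `base_model`; `density` and `epsilon` control the DELLA drop-and-rescale behavior per model.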

Good for

  • Researchers and developers interested in exploring advanced model merging techniques.
  • Applications requiring a model with a broad range of capabilities derived from a diverse set of base models.
  • Use cases benefiting from a 24B parameter model with a substantial context window.