Naphula/Quill-v1-abliterated

Text Generation · Concurrency Cost: 1 · Model Size: 9B · Quant: FP8 · Ctx Length: 16k · License: gemma · Architecture: Transformer

Naphula/Quill-v1-abliterated is a 9 billion parameter language model created primarily for testing related to model merging. This version has not undergone MPOA ablation, which may result in cognitive degradation or bugged outputs. It is explicitly noted as unsuitable for direct use in merges; instead, it serves as a preliminary test of integration into a 'psycho merge' before proper ablation.


Model Overview

Naphula/Quill-v1-abliterated is a 9 billion parameter model developed by Naphula, primarily intended for experimental testing within model merging workflows. This specific iteration has not been ablated using MPOA (Memory-Preserving Orthogonal Ablation), which the developer notes could lead to potential cognitive degradation or unexpected outputs.

Key Characteristics

  • Parameter Count: 9 billion parameters.
  • Context Length: Supports a context length of 16384 tokens.
  • Ablation Status: This version is explicitly stated as not having undergone MPOA ablation.
  • Purpose: Created for internal testing to observe its behavior within a 'psycho merge' context.

Usage Recommendations

  • Not for Direct Merging: The developer strongly advises against using this model directly in merges due to its unablated state and potential for bugged outputs.
  • Post-Merge Ablation: It is recommended that models be merged first, and then ablated using methods like norm-preserving biprojected abliteration for optimal performance and stability.
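To give a rough sense of what "ablation" refers to here, the sketch below shows the generic orthogonal-projection idea behind abliteration: removing a single behavioral direction (such as a refusal direction) from a weight matrix. This is a minimal illustration in NumPy, not the norm-preserving biprojected method the developer recommends, whose exact details are not described in this card.

```python
import numpy as np

def ablate_direction(W, r):
    """Project the direction r out of each row of weight matrix W.

    Generic sketch of abliteration-style editing: after this,
    every row of W is orthogonal to r, so the model component
    along that direction is removed. (The actual norm-preserving
    biprojected variant differs in detail.)
    """
    r = r / np.linalg.norm(r)        # unit vector for the target direction
    return W - np.outer(W @ r, r)    # subtract each row's component along r

# Toy example with random weights and a random direction
rng = np.random.default_rng(0)
W = rng.standard_normal((4, 8))
r = rng.standard_normal(8)
W_abl = ablate_direction(W, r)

# Rows of the edited matrix carry no component along r
print(np.allclose(W_abl @ (r / np.linalg.norm(r)), 0.0))  # → True
```

A plain projection like this shrinks the norms of the edited rows, which is one reason the developer recommends norm-preserving variants for stability after merging.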