Epiculous/NovaSpark

  • Parameters: 8B
  • Precision: FP8
  • Context length: 32,768 tokens
  • License: apache-2.0

NovaSpark Overview

NovaSpark is an 8-billion-parameter language model developed by Epiculous. It is built on grimjim's "abliterated" version of Arcee's Llama-3.1-SuperNova-Lite; abliteration edits a model's weights to suppress the internal directions associated with refusal behavior. The primary objective behind NovaSpark's training was to reduce the refusals and censorship inherited from its base models, though further abliteration may be necessary to fully reinforce this characteristic.
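For intuition, a common abliteration recipe (as described in community write-ups, not necessarily the exact procedure used for NovaSpark) estimates a "refusal direction" in the model's residual stream and projects it out of the weight matrices that write into that stream. A minimal PyTorch sketch, assuming the refusal direction has already been computed:

```python
import torch

def ablate_refusal_direction(weight: torch.Tensor,
                             refusal_dir: torch.Tensor) -> torch.Tensor:
    """Project the refusal direction out of a weight matrix that writes
    into the residual stream (weight shape: [d_model, d_in]).

    Illustrative only: NovaSpark's actual procedure is not documented here.
    """
    d = refusal_dir / refusal_dir.norm()   # unit refusal direction, shape [d_model]
    # (I - d d^T) W zeroes the component of every output along d
    return weight - torch.outer(d, d @ weight)

# The refusal direction is typically estimated as the difference between
# mean hidden activations on prompts the model refuses vs. complies with.
```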

Key Characteristics

  • Base Architecture: Derived from Llama-3.1-SuperNova-Lite.
  • Parameter Count: 8 billion parameters.
  • Training Focus: Aims to reduce model censorship and refusals through an "abliteration" process.
  • Prompting: Utilizes the standard Llama instruct template for optimal performance.

Usage and Prompting

NovaSpark is instruction-tuned and expects input formatted with the Llama 3 instruct template. This format uses the special tokens <|begin_of_text|>, <|start_header_id|>, <|end_header_id|>, and <|eot_id|> to delineate system, user, and assistant turns. Users are encouraged to follow this structure for best results. The model card also links to recommended sampler settings for tuning output creativity and style.
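As a sketch, the tokenizer's built-in chat template can assemble this format automatically (assuming the standard Hugging Face transformers API and the Epiculous/NovaSpark repository):

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Epiculous/NovaSpark"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Explain what abliteration does in one sentence."},
]

# apply_chat_template emits the Llama instruct layout, e.g.:
# <|begin_of_text|><|start_header_id|>system<|end_header_id|>
# ...<|eot_id|><|start_header_id|>user<|end_header_id|>...<|eot_id|>
# <|start_header_id|>assistant<|end_header_id|>
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```

Sampler settings (temperature, min-p, etc.) are left at library defaults here; consult the settings linked from the model card for the author's recommendations.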