allura-org/TQ2.5-14B-Aletheia-v1

14B parameters · FP8 · 32768-token context · License: apache-2.0

Model Overview

allura-org/TQ2.5-14B-Aletheia-v1 is a 14-billion-parameter language model developed by Auri, designed as a hybrid roleplay (RP) and story-writing model. It is a merge of allura-org/TQ2.5-14B-Sugarquill-v1 and allura-org/TQ2.5-14B-Neon-v1, intended to balance Sugarquill's creative prose with better steerability in RP scenarios. The final merge used the SLERP method and took more than 20 attempts; TIES-based merges proved ineffective.
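
For readers unfamiliar with SLERP merging, the sketch below shows the core idea: each pair of corresponding weight tensors from the two parent models is interpolated along the arc between them rather than along a straight line, which tends to preserve more of each parent's behaviour than plain averaging. This is an illustrative Python sketch only, not the actual merge recipe or tooling used for Aletheia; the interpolation factor `t` and any per-layer settings are assumptions.

```python
import torch

def slerp(t: float, a: torch.Tensor, b: torch.Tensor, eps: float = 1e-8) -> torch.Tensor:
    """Spherical linear interpolation between two weight tensors (illustrative only)."""
    a_flat, b_flat = a.flatten().float(), b.flatten().float()
    a_norm = a_flat / (a_flat.norm() + eps)
    b_norm = b_flat / (b_flat.norm() + eps)
    # Angle between the two weight vectors
    omega = torch.acos(torch.clamp(torch.dot(a_norm, b_norm), -1.0, 1.0))
    if omega.abs() < eps:
        # Nearly parallel tensors: fall back to plain linear interpolation
        return (1.0 - t) * a + t * b
    so = torch.sin(omega)
    merged = (torch.sin((1.0 - t) * omega) / so) * a_flat + (torch.sin(t * omega) / so) * b_flat
    return merged.reshape(a.shape).to(a.dtype)

# Conceptually, each parameter tensor of the merged model is something like
# slerp(t, sugarquill_tensor, neon_tensor) for some interpolation factor t.
```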

Key Capabilities & Characteristics

  • Hybrid RP/Story Model: Specifically tuned to excel in both interactive roleplay and longer-form story generation.
  • Creative Spark: Retains the creative writing capabilities inherited from the Sugarquill component.
  • Steerability: Designed to be more controllable for roleplay compared to its base models, though it can be sensitive to prompt instructions and sampler configurations.
  • Context Length: Supports a substantial context window of 32768 tokens.
  • ChatML Format: Uses the standard ChatML instruct format, consistent with its base model (see the prompt-formatting sketch after this list).
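
The snippet below shows what ChatML-formatted prompting looks like in practice. It assumes the repository's tokenizer ships a ChatML chat template (which the model card implies but this snippet does not verify); the system and user messages are made-up examples.

```python
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("allura-org/TQ2.5-14B-Aletheia-v1")

messages = [
    {"role": "system", "content": "You are Mara, a sardonic ship's engineer."},
    {"role": "user", "content": "The reactor is making that noise again."},
]

# Render the conversation into the model's prompt format and append the
# assistant header so the model continues as the character.
prompt = tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
print(prompt)

# Written out by hand, the ChatML layout is:
# <|im_start|>system
# You are Mara, a sardonic ship's engineer.<|im_end|>
# <|im_start|>user
# The reactor is making that noise again.<|im_end|>
# <|im_start|>assistant
```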

Recommended Usage

This model is suitable for:

  • Roleplay: Engaging in back-and-forth character interactions.
  • Storywriting: Generating narrative content, either through raw completion or collaborative co-writing.

The model is sensitive to instructions placed at low depth in the context (close to the end of the prompt) and to sampler settings. Recommended sampler configurations are provided in the model's documentation, including values for Temperature, Top-A, TFS, and DRY, along with a SillyTavern (ST) Master Import for quick setup.
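
As a rough guide to what such a configuration covers, the dictionary below lists the sampler knobs the documentation refers to. The values shown are placeholders, not the model's recommended settings, and the exact parameter names vary between backends (SillyTavern, KoboldCpp, text-generation-webui); use the ST Master Import or the model card for the real numbers.

```python
# Placeholder values only -- the recommended numbers live in the model's
# documentation / ST Master Import, not here.
sampler_settings = {
    "temperature": 1.0,     # placeholder; randomness of token selection
    "top_a": 0.0,           # placeholder; Top-A cutoff
    "tfs": 1.0,             # placeholder; tail-free sampling
    "dry_multiplier": 0.0,  # placeholder; DRY repetition penalty, if the backend supports it
}
```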