SlerpE/CardProjector-24B-v3

24B parameters · FP8 · Context: 32768 · License: apache-2.0
Overview

CardProjector-24B-v3: Specialized Character Generation

CardProjector-24B-v3, developed by AlexBefest, is a 24-billion-parameter language model fine-tuned specifically for generating and editing character cards, primarily for use with SillyTavern. Built on the Mistral-Small-24B-Instruct-2501 architecture, this iteration introduces substantial enhancements in character development and format flexibility.

Key Capabilities:

  • Enhanced Character Generation: Significantly improved ability to create detailed characters from ordinary natural language prompts, moving beyond strict structured formats.
  • Advanced Editing: Substantially improved editing of existing character descriptions.
  • SillyTavern JSON Support: Restored and improved functionality for generating characters directly in SillyTavern's import-ready JSON format.
  • Universal Format Conversion: Capable of converting any character description, regardless of its initial format or quality, into SillyTavern JSON.
  • YAML Integration: Added support for generating, editing, and converting characters in YAML format, which is highly recommended for its human-readability and superior processing by roleplay models.
  • Creative Writing & Logic: Features significant improvements in creative writing and enhanced logical depth in character development.
  • Increased Stability: Addresses and fixes infinite generation loops, making the model more robust and capable of working across various human-readable formats.
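To make the "import-ready JSON" target concrete, here is a minimal sketch of wrapping a plain character description in the community Character Card V2 layout that SillyTavern imports. The field names follow that spec; the helper function and the example character are illustrative, not part of the model card.

```python
import json

def to_sillytavern_json(name, description, first_message=""):
    """Wrap a plain-language character description in the
    chara_card_v2 JSON layout that SillyTavern can import.
    Fields left empty here can be filled in by the model."""
    card = {
        "spec": "chara_card_v2",
        "spec_version": "2.0",
        "data": {
            "name": name,
            "description": description,
            "personality": "",
            "scenario": "",
            "first_mes": first_message,
            "mes_example": "",
        },
    }
    return json.dumps(card, indent=2)

# Hypothetical character, for illustration only.
card_json = to_sillytavern_json("Mira", "A wandering archivist.", "Hello, traveler.")
```

In practice the model generates this JSON directly; a wrapper like this is only needed if you want to assemble or validate cards outside the model.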

Performance:

On the Character Creation & Editing validation dataset, CardProjector-24B-v3 achieved a perplexity (PPL) of 4.3368, where lower is better. This is a notable improvement over its predecessor, CardProjector-24B-v1 (5.0515).
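For readers unfamiliar with the metric: perplexity is the exponential of the mean negative log-likelihood the model assigns to the validation tokens. A small sketch, using made-up per-token losses rather than the model's actual values:

```python
import math

def perplexity(token_nlls):
    """Perplexity = exp(mean negative log-likelihood); lower is better."""
    return math.exp(sum(token_nlls) / len(token_nlls))

# Illustrative per-token negative log-likelihoods (not real model outputs).
losses = [1.2, 1.6, 1.5, 1.4]
ppl = perplexity(losses)
```

A drop from 5.05 to 4.34 means the v3 model is, on average, measurably less "surprised" by the validation set than v1 was.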

Usage Recommendations:

For optimal results, first generate a character in a standard, natural language format. After any necessary adjustments, convert the final version into YAML for its structured benefits, or directly into JSON for SillyTavern import. The model is designed for the Mistral V7 chat template. Suggested sampling parameters for balanced output: Temperature 0.7-0.8, Top-P 0.92, repetition penalty (Rp.Pen) 1.07, and Top-K 100.
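To illustrate what those sampling knobs do, here is a toy sketch of how temperature, top-k, and top-p (nucleus) filtering reshape a next-token distribution. The logit values and the filtering order (top-k before top-p) are assumptions mirroring common implementations; the model card itself only recommends the parameter values.

```python
import math

def sample_filter(logits, temperature=0.75, top_k=100, top_p=0.92):
    """Return the renormalized probability distribution left after
    temperature scaling, top-k truncation, and top-p (nucleus) filtering."""
    scaled = [l / temperature for l in logits]          # temperature scaling
    order = sorted(range(len(scaled)), key=lambda i: scaled[i], reverse=True)
    kept = order[:top_k]                                # top-k truncation
    m = max(scaled[i] for i in kept)                    # stable softmax
    exps = {i: math.exp(scaled[i] - m) for i in kept}
    z = sum(exps.values())
    probs = {i: e / z for i, e in exps.items()}
    cum, nucleus = 0.0, []                              # top-p: smallest
    for i in sorted(probs, key=probs.get, reverse=True):  # prefix with
        nucleus.append(i)                               # mass >= top_p
        cum += probs[i]
        if cum >= top_p:
            break
    mass = sum(probs[i] for i in nucleus)
    return {i: probs[i] / mass for i in nucleus}        # renormalize

# Toy logits for a 4-token vocabulary.
dist = sample_filter([2.0, 1.0, 0.5, -1.0], top_k=3)
```

Lower temperature sharpens the distribution toward the top tokens; top-k and top-p then prune the unlikely tail, which is why these settings trade off creativity against coherence.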