SicariusSicariiStuff/Redemption_Wind_24B

Text generation · Concurrency cost: 2 · Model size: 24B · Quant: FP8 · Context length: 32k · Published: Feb 6, 2025 · License: apache-2.0 · Architecture: Transformer

Redemption_Wind_24B by SicariusSicariiStuff is a 24 billion parameter, lightly fine-tuned Mistral 24B base model with a 32768 token context length. It is primarily designed as an adaptable foundation for further fine-tuning and merging, excelling in roleplay and creative writing due to its specialized training datasets and exceptional adherence to character cards. While generally usable, its core strength lies in providing a high-quality, low-refusal base for developers.

Overview

Redemption_Wind_24B is a 24 billion parameter model developed by SicariusSicariiStuff, based on the Mistral 24B architecture. It is intentionally "undercooked" with a target average loss value of 8.0, making it highly receptive to further fine-tuning and merging. The model is ChatML-ified and incorporates high-quality private instruct data, ensuring good markdown understanding and minimal refusals, though some pre-trained alignment is noted.
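Because the model is ChatML-ified, prompts should follow the ChatML turn format. A minimal sketch of assembling such a prompt (the role names are standard ChatML; the sample messages are illustrative, not from the model card):

```python
def chatml_prompt(turns):
    """Render (role, content) pairs into ChatML and open an assistant turn."""
    rendered = "".join(
        f"<|im_start|>{role}\n{content}<|im_end|>\n" for role, content in turns
    )
    # Leave the assistant turn open so the model continues from here.
    return rendered + "<|im_start|>assistant\n"

prompt = chatml_prompt([
    ("system", "You are a creative writing assistant."),
    ("user", "Write an opening line for a fantasy story."),
])
```

In practice the tokenizer's built-in chat template (if one ships with the model) should produce the same structure.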

Key Capabilities

  • Foundation for Fine-tuning: Designed as an accessible and adaptable base for developers to build upon.
  • Roleplay & Creative Writing: Lightly fine-tuned with high-quality private datasets for creative writing and roleplay, including entries up to 16k tokens.
  • Character Adherence: Demonstrates exceptional adherence to character cards, making it suitable for roleplay scenarios.
  • Low Refusals: Engineered to have minimal refusals, providing a more open-ended generation experience.
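For the character-card adherence noted above, a common pattern is to flatten the card into the system turn of the prompt. A hypothetical sketch (the card fields and wording are illustrative, not a format the model card prescribes):

```python
def card_to_system_prompt(card):
    """Flatten a character-card dict into a system message (field names illustrative)."""
    lines = [f"{key.capitalize()}: {value}" for key, value in card.items()]
    return "You are roleplaying the character below. Stay in character.\n" + "\n".join(lines)

card = {
    "name": "Mira",
    "persona": "a wry sky-pirate captain",
    "scenario": "docked at a floating port, looking for a crew",
}
system_prompt = card_to_system_prompt(card)
```

The resulting string would then be placed in the ChatML system turn ahead of the conversation.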

Good For

  • Fine-tuning and Merging: Its primary use case is serving as a robust base for developers to create specialized models.
  • Roleplay Applications: Excels at following character cards and generating creative roleplay content.
  • Creative Writing: Capable of generating stories and creative text, benefiting from its specialized dataset.
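For developers evaluating it as a base, a minimal loading sketch with Hugging Face `transformers` might look like the following (the `torch_dtype` and `device_map` choices are assumptions, not recommendations from the model card):

```python
# Hypothetical loading sketch; requires `transformers` (and `accelerate` for device_map).
MODEL_ID = "SicariusSicariiStuff/Redemption_Wind_24B"

def load_model():
    # Imported inside the function so the sketch can be read without the libraries installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID,
        torch_dtype="auto",   # use the checkpoint's native dtype
        device_map="auto",    # shard across available GPUs
    )
    return tokenizer, model
```

A 24B checkpoint needs substantial VRAM at full precision; quantized variants (the card lists an FP8 quant) reduce that footprint.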

Despite being positioned as a base model, it exhibits some assistant-oriented behavior inherited from its pre-training, which SicariusSicariiStuff has worked to dilute and adapt for broader utility.