Overview
Redemption_Wind_24B is a 24-billion-parameter model by SicariusSicariiStuff, built on the Mistral 24B architecture. It is intentionally "undercooked", trained to a target average loss of 8.0, which leaves it highly receptive to further fine-tuning and merging. The model is ChatML-ified and incorporates high-quality private instruct data, giving it good markdown understanding and minimal refusals, though some alignment from pre-training persists.
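Since the model is ChatML-ified, prompts should follow the ChatML turn structure. A minimal sketch of that format, assuming the standard `<|im_start|>`/`<|im_end|>` special tokens; the tokenizer's own chat template on the model repo is the authoritative source:

```python
# Sketch: render a conversation in the ChatML format the model expects.
# Assumes the standard <|im_start|>/<|im_end|> tokens; verify against the
# actual tokenizer's chat template before relying on this.

def to_chatml(messages):
    """Render a list of {role, content} dicts as a ChatML string."""
    parts = []
    for msg in messages:
        parts.append(f"<|im_start|>{msg['role']}\n{msg['content']}<|im_end|>")
    # Leave the assistant turn open so the model continues from here.
    parts.append("<|im_start|>assistant\n")
    return "\n".join(parts)

prompt = to_chatml([
    {"role": "system", "content": "You are a creative writing partner."},
    {"role": "user", "content": "Describe a storm rolling over the dunes."},
])
print(prompt)
```

In practice, `tokenizer.apply_chat_template(...)` from the `transformers` library handles this rendering for models that ship a chat template.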
Key Capabilities
- Foundation for Fine-tuning: Designed as an accessible and adaptable base for developers to build upon.
- Roleplay & Creative Writing: Lightly fine-tuned with high-quality private datasets for creative writing and roleplay, including entries up to 16k tokens.
- Character Adherence: Demonstrates exceptional adherence to character cards, making it suitable for roleplay scenarios.
- Low Refusals: Engineered to have minimal refusals, providing a more open-ended generation experience.
Good For
- Fine-tuning and Merging: Its primary use case is serving as a robust base for developers to create specialized models.
- Roleplay Applications: Excels at following character cards and generating creative roleplay content.
- Creative Writing: Capable of generating stories and creative text, benefiting from its specialized dataset.
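Since merging is named as a primary use case, here is a minimal sketch of the simplest merge strategy, linear (weighted-average) merging between two fine-tunes that share this base. The state dicts below use plain Python lists for illustration; real merges operate on tensors, typically via tooling such as mergekit:

```python
# Sketch: linear weight merging between two fine-tunes of the same base.
# State dicts are shown as {name: list_of_floats} for illustration; in
# practice these would be torch tensors with identical shapes.

def linear_merge(state_a, state_b, alpha=0.5):
    """Return alpha*A + (1-alpha)*B per parameter; keys must match."""
    assert state_a.keys() == state_b.keys(), "models must share architecture"
    return {
        name: [alpha * a + (1 - alpha) * b
               for a, b in zip(state_a[name], state_b[name])]
        for name in state_a
    }

# Two hypothetical fine-tunes derived from the same checkpoint:
rp_tune = {"layer.0.weight": [0.2, 0.4], "layer.0.bias": [0.0, 1.0]}
story_tune = {"layer.0.weight": [0.6, 0.0], "layer.0.bias": [0.2, 0.8]}

merged = linear_merge(rp_tune, story_tune, alpha=0.5)
print(merged["layer.0.weight"])  # ~[0.4, 0.2]
```

The model's deliberately high loss is what makes this attractive: a lightly trained base tends to merge more cleanly than a heavily aligned instruct model.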
Despite being presented as a base model, it exhibits some assistant-oriented behavior inherited from pre-training, which SicariusSicariiStuff has worked to dilute so the model serves as a more neutral foundation for broader use.