MrRikyz/StarlightMoon-Foxfire-12B

Text Generation · Concurrency cost: 1 · Model size: 12B · Quant: FP8 · Context length: 32k · Published: Mar 24, 2026 · Architecture: Transformer

StarlightMoon-Foxfire-12B by MrRikyz is a 12-billion-parameter merged language model built on SicariusSicariiStuff's Impish_Bloodmoon_12B and several other 12B roleplay models. Created with the model_stock merge method, it combines components from multiple specialized models and is designed and optimized for roleplay and interactive narrative generation.


Overview

StarlightMoon-Foxfire-12B is a 12-billion-parameter language model developed by MrRikyz, created by merging several existing 12B models. Impish_Bloodmoon_12B by SicariusSicariiStuff serves as the base model.

Key Characteristics

  • Merge Method: The model was constructed using the model_stock merge method via mergekit.
  • Component Models: It integrates contributions from a diverse set of 12B models, including:
    • SicariusSicariiStuff/Impish_Bloodmoon_12B (Base) and Impish_Bloodmoon_12B_Abliterated
    • DreadPoor/Famino-12B-Model_Stock
    • PygmalionAI/Eleusis-12B and PygmalionAI/Pygmalion-3-12B
    • MrRikyz/Foxfire_Bloom
    • Vortex5/Azure-Starlight-12B
  • Merge Parameters: The merge used a lambda value of 0.82, with the output dtype set to bfloat16.
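
Based on the method, base model, components, and parameters listed above, the merge configuration likely resembled the following mergekit YAML. This is a reconstruction, not the author's published config: the field layout follows mergekit's standard schema, and the ordering of component models is an assumption.

```yaml
# Hypothetical reconstruction of the mergekit config for StarlightMoon-Foxfire-12B.
# The exact file was not reproduced in this card; values come from the parameters above.
merge_method: model_stock
base_model: SicariusSicariiStuff/Impish_Bloodmoon_12B
models:
  - model: SicariusSicariiStuff/Impish_Bloodmoon_12B_Abliterated
  - model: DreadPoor/Famino-12B-Model_Stock
  - model: PygmalionAI/Eleusis-12B
  - model: PygmalionAI/Pygmalion-3-12B
  - model: MrRikyz/Foxfire_Bloom
  - model: Vortex5/Azure-Starlight-12B
parameters:
  lambda: 0.82
dtype: bfloat16
```

A config like this would be run with `mergekit-yaml config.yml ./output-model`, producing the merged bfloat16 checkpoint.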

Intended Use

This model is designed for applications requiring advanced roleplay capabilities, leveraging the combined strengths of its constituent models to generate nuanced, engaging interactive narratives.