Valeciela/KansenSakura-Symbiosis-12B
Text generation · Model size: 12B · Quant: FP8 · Context length: 32k · License: apache-2.0 · Architecture: Transformer

Valeciela/KansenSakura-Symbiosis-12B is an experimental 12 billion parameter language model with a 32768 token context length, created by Valeciela. It is a Multi-SLERP merge of several KansenSakura-RP-12b models, using Retreatcost/KansenSakura-Zero-RP-12b as its base, and aims to combine the strengths of its roleplay-oriented constituent models in a single checkpoint.


KansenSakura-Symbiosis-12B Overview

KansenSakura-Symbiosis-12B is an experimental 12 billion parameter language model developed by Valeciela, featuring a substantial 32768 token context window. This model is a product of a Multi-SLERP merge process, building upon Retreatcost/KansenSakura-Zero-RP-12b as its foundational base.

Merge Details

The model integrates contributions from three distinct KansenSakura-RP-12b variants:

  • Retreatcost/KansenSakura-Erosion-RP-12b
  • Retreatcost/KansenSakura-Eclipse-RP-12b
  • Retreatcost/KansenSakura-Radiance-RP-12b

Each of these models was merged with a weight of 0.5, with the aim of combining their respective strengths. The merge configuration specifies a chatml chat template and bfloat16 dtype, with normalization applied during the merge.
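A merge of this shape is typically expressed as a mergekit configuration. The sketch below is illustrative only, reconstructed from the details above; the exact file the author used may differ in structure or parameter names:

```yaml
# Hypothetical mergekit config sketch (not the author's published recipe)
base_model: Retreatcost/KansenSakura-Zero-RP-12b
merge_method: multislerp
models:
  - model: Retreatcost/KansenSakura-Erosion-RP-12b
    parameters:
      weight: 0.5
  - model: Retreatcost/KansenSakura-Eclipse-RP-12b
    parameters:
      weight: 0.5
  - model: Retreatcost/KansenSakura-Radiance-RP-12b
    parameters:
      weight: 0.5
parameters:
  normalize: true   # normalize weights during the merge, as the card states
dtype: bfloat16
```

Equal weights of 0.5 with normalization mean each donor model effectively contributes an equal share along the spherical interpolation path anchored at the base model.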

Usage and Availability

This model is provided in GGUF format, with both a standard version and a Q6_K_XL quantized version available for download. Users are advised that this is an experimental model and should be used with caution.
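Because the merge specifies a chatml chat template, prompts sent to the GGUF files (e.g. via llama.cpp or similar runtimes) should follow the ChatML turn format. A minimal sketch in Python; the helper function here is illustrative and not part of the model's tooling:

```python
def build_chatml_prompt(system: str, user: str) -> str:
    """Format one system + user turn in ChatML, the template this merge specifies.

    The trailing assistant header leaves the prompt open for the model
    to generate its reply.
    """
    return (
        f"<|im_start|>system\n{system}<|im_end|>\n"
        f"<|im_start|>user\n{user}<|im_end|>\n"
        f"<|im_start|>assistant\n"
    )

prompt = build_chatml_prompt("You are a helpful assistant.", "Hello!")
print(prompt)
```

Most GGUF runtimes can also apply the chat template embedded in the file automatically, in which case manual formatting like this is unnecessary.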