Undi95/ReMM-v2-Kimiko-v2-13B
Text Generation · Concurrency Cost: 1 · Model Size: 13B · Quant: FP8 · Ctx Length: 4k · License: cc-by-nc-4.0 · Architecture: Transformer · Open Weights · Cold

Undi95/ReMM-v2-Kimiko-v2-13B is a 13-billion-parameter language model created by Undi95, formed by merging the ReMM-v2-L2-13B base model with the Kimiko-v2-13B LoRA adapter at a 0.50 weight. The merge folds the adapter's fine-tuned behavior into the base model's weights, combining the base model's general capabilities with the Kimiko-v2 adapter's characteristics in a single checkpoint.


Model Overview

Undi95/ReMM-v2-Kimiko-v2-13B is a 13 billion parameter language model developed by Undi95. This model is a result of merging two distinct components: the base model, ReMM-v2-L2-13B, and a LoRA (Low-Rank Adaptation) adapter, Kimiko-v2-13B. The merge was performed with a specific weight of 0.50 for the LoRA.
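A weighted LoRA merge amounts to folding the adapter's low-rank update into the base weights at a fraction of full strength. The following NumPy sketch illustrates that arithmetic; the function name, variable names, and the exact alpha/rank scaling convention are illustrative assumptions, not taken from the model repository:

```python
import numpy as np

def merge_lora(base_weight, lora_A, lora_B, alpha, rank, merge_weight=0.5):
    """Fold a LoRA delta into a base weight matrix at a given merge weight.

    Hypothetical sketch of the arithmetic behind a 0.50-weight LoRA merge.
    """
    # LoRA delta: scaling factor (alpha / rank) times the low-rank product B @ A
    delta = (alpha / rank) * (lora_B @ lora_A)
    # A merge weight of 0.50 applies the adapter at half strength
    return base_weight + merge_weight * delta

# Toy dimensions stand in for a real 13B model's weight matrices
rng = np.random.default_rng(0)
d_out, d_in, r = 8, 8, 2
W = rng.standard_normal((d_out, d_in))   # base weight matrix
A = rng.standard_normal((r, d_in))       # LoRA down-projection
B = rng.standard_normal((d_out, r))      # LoRA up-projection

W_merged = merge_lora(W, A, B, alpha=16, rank=r, merge_weight=0.5)
```

Once every targeted matrix is updated this way, the result is an ordinary checkpoint: no adapter files or runtime LoRA machinery are needed at inference time.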

Key Characteristics

  • Merged Architecture: Combines a foundational 13B parameter model with a specialized LoRA adapter.
  • Customized Performance: The 0.50 merge weight applies the Kimiko-v2-13B LoRA's updates at half strength, balancing the base model's general capabilities against the adapter's fine-tuned characteristics.

Potential Use Cases

This model suits developers who want a pre-merged checkpoint that integrates the strengths of the base model and the Kimiko-v2-13B LoRA. It is particularly useful when the adapter's enhancements or stylistic properties are desired without the runtime overhead of loading the LoRA dynamically.