Undi95/ReMM-S-Kimiko-v2-13B

Text Generation · Concurrency Cost: 1 · Model Size: 13B · Quant: FP8 · Ctx Length: 4k · License: cc-by-nc-4.0 · Architecture: Transformer · Open Weights · Cold

Undi95/ReMM-S-Kimiko-v2-13B is a 13 billion parameter language model created by Undi95, formed by merging the Kimiko-v2-13B LoRA adapter into the base model ReMM-SLERP-L2-13B. The merge folds the Kimiko-v2 fine-tuning directly into the base weights, aiming to combine the strengths of both components in a single model for general language generation tasks.


Model Overview

Undi95/ReMM-S-Kimiko-v2-13B is a 13 billion parameter language model developed by Undi95, produced by merging the Kimiko-v2-13B LoRA adapter into the ReMM-SLERP-L2-13B base model. The merge was performed with a weight of 0.50, meaning the LoRA's weight update was applied at half strength on top of the base model's weights.
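At a high level, merging a LoRA at weight 0.50 means scaling the low-rank update it encodes by 0.50 before adding it to each targeted base weight matrix. The sketch below illustrates this arithmetic on a toy matrix with NumPy; the dimensions, rank, and `alpha` value are illustrative assumptions, not the actual hyperparameters of Kimiko-v2-13B.

```python
import numpy as np

rng = np.random.default_rng(0)

d, r = 8, 2          # toy hidden size and LoRA rank (assumed for illustration)
alpha = 4            # LoRA scaling hyperparameter (assumed)
merge_weight = 0.50  # merge weight stated in the model card

W = rng.normal(size=(d, d))   # a base-model weight matrix
A = rng.normal(size=(r, d))   # LoRA down-projection
B = rng.normal(size=(d, r))   # LoRA up-projection

# The LoRA encodes a low-rank update scaled by alpha / r.
delta = (B @ A) * (alpha / r)

# Merging at weight 0.50 applies the update at half strength.
W_merged = W + merge_weight * delta

print(W_merged.shape)  # (8, 8)
```

In practice this scaling is applied per layer across all modules the LoRA targets; tooling such as the PEFT library automates it, but the underlying operation is the elementwise addition shown here.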

Key Characteristics

  • Merged Architecture: Folds a specialized LoRA's fine-tuning into a robust base model, so no separate adapter is needed at inference time.
  • Parameter Count: Features 13 billion parameters, offering a balance between capability and computational requirements.
  • Origin: Derived from Undi95/ReMM-SLERP-L2-13B and nRuaif/Kimiko-v2-13B LoRA.

Intended Use Cases

This model is suited to general-purpose language generation tasks, particularly those targeted by the Kimiko-v2 fine-tuning. Because the LoRA is already merged into the weights, developers can deploy it like any standard 13B model without managing adapters at runtime.