Undi95/MythoMax-L2-Kimiko-v2-13b
Text Generation · Open Weights · Warm

  • Model Size: 13B
  • Quantization: FP8
  • Context Length: 4k
  • Concurrency Cost: 1
  • Published: Aug 30, 2023
  • License: cc-by-nc-4.0
  • Architecture: Transformer

Undi95/MythoMax-L2-Kimiko-v2-13b is a 13 billion parameter language model created by Undi95, formed by merging the MythoMax-L2-13b base model with the Kimiko-v2-13B LoRA. This merged model is designed to combine the strengths of its components, offering enhanced capabilities for generative tasks. It is particularly suited for applications requiring nuanced text generation and creative content.


Model Overview

This model merges two components: the MythoMax-L2-13b base model and the Kimiko-v2-13B LoRA (Low-Rank Adaptation), folding the adapter's learned weights directly into the base model's parameters.

Key Characteristics

  • Merged Architecture: Combines a robust base model with a specialized LoRA for potentially enhanced performance in specific domains.
  • Parameter Count: Features 13 billion parameters, placing it in the medium-sized category for large language models, balancing performance with computational efficiency.
  • Origin: Both components, the MythoMax-L2-13b base model and the Kimiko-v2-13B LoRA, are openly available on Hugging Face, giving the merge a community-driven, open-weights lineage.
  • Merging Weight: The LoRA was merged with a weight of 0.50, suggesting a balanced integration of the LoRA's learned features with the base model's existing knowledge.
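The 0.50 merge weight can be illustrated with a toy sketch of how a LoRA update is folded into a base weight matrix. The dimensions below are arbitrary toy values, not the model's actual shapes:

```python
import numpy as np

rng = np.random.default_rng(0)

d, r = 8, 2  # toy hidden size and LoRA rank, not the model's real values
W = rng.standard_normal((d, d))   # base weight matrix
A = rng.standard_normal((r, d))   # LoRA down-projection
B = rng.standard_normal((d, r))   # LoRA up-projection
alpha = 0.50                      # merge weight reported on this card

# Merging scales the low-rank update B @ A by the merge weight and
# adds it into the base weights, producing a single dense matrix.
W_merged = W + alpha * (B @ A)
```

A weight of 0.50 thus blends half of the adapter's learned delta into the base model, rather than applying it at full strength.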

Potential Use Cases

This model is likely suitable for applications that benefit from the combined strengths of its constituent models. Given the nature of LoRA merges, it may excel in:

  • Creative Text Generation: Generating diverse and imaginative content.
  • Role-playing Scenarios: Producing coherent and contextually relevant dialogue.
  • Storytelling: Crafting narratives with specific stylistic elements introduced by the LoRA.
  • Fine-tuned Responses: Providing more nuanced or specialized outputs compared to the base model alone.
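As an illustration, a request to an OpenAI-compatible text-completion endpoint for this model might look like the following. The field names follow the OpenAI convention and the prompt is a made-up example; only the model ID comes from this card:

```python
# Hypothetical request body for an OpenAI-compatible completions
# endpoint. Endpoint details and field values are assumptions; the
# model identifier is the one listed on this page.
payload = {
    "model": "Undi95/MythoMax-L2-Kimiko-v2-13b",
    "prompt": "Write the opening paragraph of a fantasy story.",
    "max_tokens": 256,
    "temperature": 0.8,
}
```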

Popular Sampler Settings

The top three parameter combinations used by Featherless users for this model are available on the platform. Each configuration sets the following sampling parameters:

  • temperature
  • top_p
  • top_k
  • frequency_penalty
  • presence_penalty
  • repetition_penalty
  • min_p
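Of these parameters, top_p (nucleus sampling) is a common source of confusion. Below is a minimal sketch, using a toy four-token vocabulary, of how it filters a next-token distribution before sampling:

```python
import numpy as np

def top_p_filter(probs, top_p):
    """Keep the smallest set of tokens whose cumulative probability
    reaches top_p, then renormalize (nucleus sampling)."""
    order = np.argsort(probs)[::-1]           # token indices, most likely first
    cumulative = np.cumsum(probs[order])
    # Include tokens up to and including the one that crosses top_p.
    cutoff = np.searchsorted(cumulative, top_p) + 1
    kept = order[:cutoff]
    filtered = np.zeros_like(probs)
    filtered[kept] = probs[kept]
    return filtered / filtered.sum()

# Toy distribution: with top_p=0.9, the least likely token is dropped
# and the remaining mass is renormalized.
probs = np.array([0.5, 0.3, 0.15, 0.05])
filtered = top_p_filter(probs, top_p=0.9)
```

Lower top_p values concentrate sampling on the most likely tokens, while temperature reshapes the distribution itself before this filtering step.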