Undi95/MXLewdMini-L2-13B

Text generation · Model size: 13B · Quant: FP8 · Context length: 4k · License: cc-by-nc-4.0 · Architecture: Transformer · Concurrency cost: 1

Undi95/MXLewdMini-L2-13B is a 13 billion parameter language model created by Undi95, designed as a merge of several Llama 2-based models. It combines elements from Undi95/MLewd-L2-13B-v2-3, Undi95/ReMM-v2.1-L2-13B, and Xwin-LM/Xwin-LM-13B-V0.1. This model aims to replicate the performance of larger models like MXLewd-L2-20B in a more compact 13B parameter size, making it suitable for applications requiring a balance of capability and efficiency.


Overview

Undi95/MXLewdMini-L2-13B is a 13 billion parameter language model developed by Undi95. It is a merged model, specifically designed to achieve capabilities similar to the larger MXLewd-L2-20B but within a more efficient 13B parameter footprint. The model was constructed without using merge interlacing, focusing on a direct combination of its constituent parts.

Key Components

This model is a blend of three distinct Llama 2-based models:

  • Undi95/MLewd-L2-13B-v2-3
  • Undi95/ReMM-v2.1-L2-13B
  • Xwin-LM/Xwin-LM-13B-V0.1

The merging process involved combining these models in specific ratios: one part consists of ReMM (0.33) and Xwin (0.66), while the other part is composed of Xwin (0.33) and MLewd (0.66).
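The weighted combination described above can be sketched as a simple linear merge of parameter tensors. This is an illustrative sketch only, not the exact recipe or tooling Undi95 used; the checkpoints are stood in for by plain Python dicts of float lists, and `merge_weights` is a hypothetical helper name.

```python
def merge_weights(a, b, ratio_a, ratio_b):
    """Linearly combine two state dicts: out = ratio_a * a + ratio_b * b."""
    merged = {}
    for name in a:
        merged[name] = [ratio_a * x + ratio_b * y
                        for x, y in zip(a[name], b[name])]
    return merged

# Toy stand-ins for real checkpoints (each a dict of parameter tensors).
remm = {"layer.weight": [1.0, 2.0]}
xwin = {"layer.weight": [3.0, 4.0]}

# One part of the recipe: ReMM (0.33) + Xwin (0.66).
part1 = merge_weights(remm, xwin, 0.33, 0.66)
```

In practice such merges operate on full Llama 2 state dicts (via e.g. PyTorch tensors) rather than lists, but the per-parameter weighted sum is the same idea.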

Prompt Format

The model utilizes an Alpaca prompt template, which is a common and straightforward format for instruction-following tasks. Users should structure their inputs as follows:

Below is an instruction that describes a task. Write a response that appropriately completes the request.

### Instruction:
{prompt}

### Response:
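Assembling the template above programmatically is straightforward; a minimal sketch (the helper name `build_alpaca_prompt` is our own, not part of any library):

```python
def build_alpaca_prompt(instruction: str) -> str:
    """Wrap a user instruction in the Alpaca template expected by this model."""
    return (
        "Below is an instruction that describes a task. "
        "Write a response that appropriately completes the request.\n\n"
        "### Instruction:\n"
        f"{instruction}\n\n"
        "### Response:\n"
    )

prompt = build_alpaca_prompt("Summarize the plot of Hamlet in one sentence.")
```

The model then generates its answer as a continuation after the `### Response:` header.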

Use Cases

Given its 13B parameter size and the blend of its base models, MXLewdMini-L2-13B is suited to creative writing and roleplay-oriented generation as well as general instruction following, in settings where a balance between model size and output quality is desired. Note that the cc-by-nc-4.0 license restricts the model to non-commercial use.