Undi95/Nete-13B: A Blended 13B Language Model
Nete-13B is a 13-billion-parameter language model developed by Undi95 as an enhanced iteration of the Xwin-MLewd-13B recipe. It merges weights from several other 13B models and LoRAs, with the goal of combining their specialized fine-tuning into a single, stronger checkpoint.
Key Components and Influences
Nete-13B was produced by merging the following models and LoRAs (a sketch of a typical LoRA merge step follows the list):
- Undi95/Mlewd-v2.4-13B
- Xwin-LM/Xwin-LM-13B-V0.2
- cgato/Thespis-13b-v0.4
- Undi95/PsyMedRP-v1-13B
- Undi95/Storytelling-v2.1-13B-lora
- lemonilia/LimaRP-Llama2-13B-v3-EXPERIMENT
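The exact merge recipe is not reproduced here, but LoRA components such as Storytelling-v2.1-13B-lora are typically folded into a base model with the `peft` library. The following is a minimal sketch, assuming Mlewd-v2.4-13B as the base for this step and a hypothetical output path; it illustrates the mechanism, not Undi95's actual procedure.

```python
from transformers import AutoModelForCausalLM
from peft import PeftModel

# Load a base model (assumed here; the real recipe blends several models).
base = AutoModelForCausalLM.from_pretrained(
    "Undi95/Mlewd-v2.4-13B", torch_dtype="auto"
)

# Attach the LoRA adapter on top of the base weights.
lora = PeftModel.from_pretrained(base, "Undi95/Storytelling-v2.1-13B-lora")

# Fold the low-rank updates into the base weights and drop the adapter.
merged = lora.merge_and_unload()

# Hypothetical output path for this intermediate merge step.
merged.save_pretrained("./nete-13b-merge-step")
```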
Blending these specialized models is intended to combine their strengths in a single checkpoint, yielding more versatile and robust text generation than any one component alone. Nete-13B uses the Alpaca prompt template for instruction-following tasks; a formatting helper is sketched below.
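The Alpaca template wraps each request in a fixed preamble followed by `### Instruction:`, an optional `### Input:`, and `### Response:` markers. A small helper to build such prompts (the function name is our own):

```python
def build_alpaca_prompt(instruction: str, user_input: str = "") -> str:
    """Format a request using the standard Alpaca prompt template."""
    if user_input:
        return (
            "Below is an instruction that describes a task, paired with an "
            "input that provides further context. Write a response that "
            "appropriately completes the request.\n\n"
            f"### Instruction:\n{instruction}\n\n"
            f"### Input:\n{user_input}\n\n"
            "### Response:\n"
        )
    return (
        "Below is an instruction that describes a task. Write a response "
        "that appropriately completes the request.\n\n"
        f"### Instruction:\n{instruction}\n\n"
        "### Response:\n"
    )
```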
Intended Use Cases
Given its component models, Nete-13B is suited to general-purpose text generation and instruction following, particularly tasks that benefit from a blend of creative and structured responses. A basic usage sketch follows.
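As a minimal sketch of loading and sampling from the model with the Hugging Face `transformers` library (the sampling settings are illustrative, not recommended values):

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Undi95/Nete-13B"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, device_map="auto", torch_dtype="auto"
)

# An Alpaca-formatted prompt, as described above.
prompt = (
    "Below is an instruction that describes a task. Write a response "
    "that appropriately completes the request.\n\n"
    "### Instruction:\nWrite a short scene set in a lighthouse.\n\n"
    "### Response:\n"
)
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

outputs = model.generate(
    **inputs, max_new_tokens=256, do_sample=True, temperature=0.8
)

# Decode only the newly generated tokens, skipping the prompt.
new_tokens = outputs[0][inputs["input_ids"].shape[1]:]
print(tokenizer.decode(new_tokens, skip_special_tokens=True))
```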