Pearax/Stable-Platypus2-13B-LIMARP
Pearax/Stable-Platypus2-13B-LIMARP is a 13 billion parameter auto-regressive language model based on the LLaMA 2 transformer architecture. It is a merge of garage-bAInd/Stable-Platypus2-13B and lemonilia/limarp-llama2, designed for general language tasks. The base Stable-Platypus2-13B was instruction fine-tuned using LoRA on STEM and logic-based datasets, suggesting a focus on reasoning capabilities. This model is suitable for applications requiring a robust 13B parameter English language model.
Model Overview
Pearax/Stable-Platypus2-13B-LIMARP is a 13 billion parameter language model built upon the LLaMA 2 transformer architecture. This model is a strategic merge of two distinct base models: garage-bAInd/Stable-Platypus2-13B and lemonilia/limarp-llama2. The Stable-Platypus2-13B component itself is a merge of garage-bAInd/Platypus2-13B and stabilityai/StableBeluga-13B.
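The card does not specify how the two parents were combined; a common approach for merging models that share the LLaMA 2 architecture is a linear interpolation of matching weight tensors. A minimal sketch, assuming a simple weighted average (the function name, merge ratio, and toy values below are illustrative, not the actual merge recipe):

```python
def merge_weights(sd_a, sd_b, alpha=0.5):
    """Linearly interpolate two state dicts with identical keys and shapes.

    alpha=0.5 gives an equal-weight average; the actual ratio used for
    Stable-Platypus2-13B-LIMARP is an assumption here.
    """
    return {
        key: [alpha * a + (1 - alpha) * b for a, b in zip(sd_a[key], sd_b[key])]
        for key in sd_a
    }

# Toy example: tiny lists stand in for the real weight tensors.
base = {"layer.0.weight": [1.0, 2.0]}
other = {"layer.0.weight": [3.0, 4.0]}
print(merge_weights(base, other))  # {'layer.0.weight': [2.0, 3.0]}
```

In practice the same idea is applied per-tensor over full model checkpoints (e.g. PyTorch `state_dict` objects) rather than Python lists.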
Key Characteristics
- Architecture: Based on the LLaMA 2 transformer architecture, providing a strong foundation for language understanding and generation.
- Training: The Platypus2-13B component was trained by Cole Hunter & Ariel Lee, while StableBeluga-13B was trained by StabilityAI. Stable-Platypus2-13B was instruction fine-tuned using LoRA on STEM and logic-based datasets such as garage-bAInd/Open-Platypus.
- Language: Primarily English.
- Prompt Format: Uses the instruction-response format `### Instruction:\n<prompt>\n### Response:`.
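The prompt format can be wrapped in a small helper so every request follows the expected template; the helper name below is illustrative:

```python
def build_prompt(instruction: str) -> str:
    """Wrap a user instruction in the model's expected prompt template:
    ### Instruction:\n<prompt>\n### Response:
    """
    return f"### Instruction:\n{instruction}\n### Response:"

# The resulting string is what you would tokenize and pass to the model.
print(build_prompt("List three prime numbers."))
```

The model then continues the text after the `### Response:` marker.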
Intended Use Cases
This model is well-suited for developers and researchers looking for a 13B parameter English language model that benefits from the combined strengths of its merged predecessors. Its fine-tuning on STEM and logic-based datasets suggests potential strengths in:
- Reasoning tasks
- Problem-solving scenarios
- General instruction-following applications
Limitations
As with all LLMs, this model carries inherent risks, including the potential for inaccurate, biased, or objectionable outputs. Users are advised to perform safety testing and tuning tailored to their specific applications, especially given the non-commercial license for the Platypus2-13B base weights.