Pearax/Stable-Platypus2-13B-LIMARP
Text generation · Concurrency cost: 1 · Model size: 13B · Quant: FP8 · Context length: 4k · License: llama2 · Architecture: Transformer · Open weights

Pearax/Stable-Platypus2-13B-LIMARP is a 13-billion-parameter autoregressive language model based on the LLaMA 2 transformer architecture. It is a merge of garage-bAInd/Stable-Platypus2-13B and lemonilia/limarp-llama2, intended for general language tasks. The base Stable-Platypus2-13B was instruction fine-tuned with LoRA on STEM and logic-oriented datasets, which suggests a focus on reasoning capabilities. This model is suitable for applications that need a capable 13B-parameter English language model.
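Instruction-tuned Platypus2 derivatives are commonly prompted with an Alpaca-style template; the exact template for this merge is not stated above, so the sketch below is an assumption (the `format_prompt` helper is illustrative, not part of the model's published API) and should be checked against the upstream model cards:

```python
from typing import Optional


def format_prompt(instruction: str, system: Optional[str] = None) -> str:
    """Build an Alpaca-style prompt (assumed template for this merge:
    an optional system preamble, '### Instruction:', then '### Response:')."""
    parts = []
    if system:
        parts.append(system)
    parts.append(f"### Instruction:\n{instruction}")
    parts.append("### Response:\n")
    # Sections are separated by a blank line, as in the Alpaca format.
    return "\n\n".join(parts)


prompt = format_prompt("Summarize the LLaMA 2 architecture in one sentence.")
```

The resulting string can be passed as the input text to any standard text-generation pipeline serving this model.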
