Model Overview
CalderaAI/13B-Theseus-MK1 is a 13-billion-parameter language model developed by CalderaAI, released as a research artifact in advanced model merging. It was constructed using Spherical Linear Interpolation (SLERP), combining four highly competent base models: nous-hermesv2, chronosv2, platypusv2, and airoborosv2. This "MK1" release marks an initial exploration of the intended capabilities of the Theseus model line, focused on demonstrating the efficacy of SLERP merges across multiple strong foundational models.
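To illustrate the idea behind a SLERP merge, the sketch below interpolates two weight tensors along the great-circle arc between them rather than along a straight line, which better preserves the geometry of the weights. This is a minimal, hypothetical NumPy sketch only; it is not the actual recipe used for Theseus-MK1, and real merge tooling additionally handles per-layer schedules, tokenizer alignment, and multi-model combination.

```python
import numpy as np

def slerp(w_a, w_b, t):
    """Spherical linear interpolation between two weight tensors.

    Illustrative sketch: treats each tensor as a flat vector and
    interpolates along the arc between them by fraction t in [0, 1].
    """
    a = w_a.ravel().astype(np.float64)
    b = w_b.ravel().astype(np.float64)
    # Angle between the two weight vectors (via their unit directions).
    dot = np.clip(np.dot(a / np.linalg.norm(a), b / np.linalg.norm(b)), -1.0, 1.0)
    theta = np.arccos(dot)
    if theta < 1e-7:
        # Nearly parallel vectors: fall back to linear interpolation.
        merged = (1.0 - t) * a + t * b
    else:
        merged = (np.sin((1.0 - t) * theta) * a + np.sin(t * theta) * b) / np.sin(theta)
    return merged.reshape(w_a.shape)

# Toy example: merge two small "weight matrices" halfway.
w1 = np.eye(2)
w2 = np.array([[0.0, 1.0], [1.0, 0.0]])
merged = slerp(w1, w2, 0.5)
```

Merging four models, as Theseus-MK1 does, would apply such pairwise interpolations in sequence or a tree; the exact schedule is not documented here.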
Key Capabilities
- High Competency: Designed to exhibit strong performance across various tasks.
- Instruction Following: Tailored to follow Alpaca instruct directives precisely, inferring and emulating context when none is explicitly given.
- Minimal Censorship: Intended to operate with minimal to no inherent censorship, offering a broader range of responses.
- Research Focus: Primarily serves as a research artifact to observe the results of complex SLERP merging techniques.
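Since the model is tailored to Alpaca instruct directives, prompts typically follow the standard Alpaca template shown below. This is the commonly used format for Alpaca-tuned models, offered here as a likely starting point rather than a format confirmed by CalderaAI for this specific merge.

```
Below is an instruction that describes a task. Write a response that appropriately completes the request.

### Instruction:
{your instruction here}

### Response:
```

A variant with an additional `### Input:` section is also common when the task includes supporting context.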
Good For
- Research and Development: Ideal for researchers interested in advanced model merging techniques like SLERP.
- Exploration of Unfiltered Responses: Suitable for use cases requiring models with minimal inherent content restrictions.
- High-Precision Instruction Following: Effective for applications where exact adherence to instructions and context emulation is critical.