CalderaAI/13B-Theseus-MK1
Text generation · Model size: 13B · Quantization: FP8 · Context length: 4k · License: llama2 · Architecture: Transformer

CalderaAI/13B-Theseus-MK1 is a 13-billion-parameter language model developed by CalderaAI, created through a Spherical Linear Interpolation (SLERP) merge of nous-hermesv2, chronosv2, platypusv2, and airborosv2. This research artifact targets high competency with minimal censorship and closely follows Alpaca-style instruct directives. It serves as a demonstration of advanced model-merging techniques for creating capable and adaptable models.
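SLERP interpolates along the great-circle arc between two weight tensors rather than along a straight line, which preserves the magnitude of the merged weights better than naive averaging. A minimal NumPy sketch of the idea (the function and its parameters are illustrative, not CalderaAI's actual merge code):

```python
import numpy as np

def slerp(t, v0, v1, eps=1e-8):
    """Spherically interpolate between two weight tensors at fraction t in [0, 1]."""
    # Compute the angle between the flattened tensors.
    v0_f = v0.ravel().astype(np.float64)
    v1_f = v1.ravel().astype(np.float64)
    cos_theta = np.dot(v0_f, v1_f) / (
        np.linalg.norm(v0_f) * np.linalg.norm(v1_f) + eps
    )
    theta = np.arccos(np.clip(cos_theta, -1.0, 1.0))
    if theta < eps:
        # Nearly colinear tensors: fall back to linear interpolation.
        return (1 - t) * v0 + t * v1
    s = np.sin(theta)
    # Weight each endpoint so the result moves along the arc between them.
    return (np.sin((1 - t) * theta) / s) * v0 + (np.sin(t * theta) / s) * v1
```

In a real merge this would be applied layer by layer across the parent models' state dicts; SLERP itself is defined for two endpoints, so a four-way merge like Theseus-MK1 is typically built from successive pairwise interpolations.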
