Naphula/WBCR-SLERP-24B-v1
Text Generation · Concurrency Cost: 2 · Model Size: 24B · Quant: FP8 · Ctx Length: 32k · Published: Mar 22, 2026 · Architecture: Transformer

Naphula/WBCR-SLERP-24B-v1 is a 24-billion-parameter language model based on the MistralForCausalLM architecture, with a 32768-token context window. It is a multi-stage SLERP (spherical linear interpolation) merge of several base models, including WeirdCompound, BereavedCompound, Circuitry, and Rotor. Rather than being trained from scratch, it is a composite model that blends the weights, and hence the characteristics, of its constituent models. It is intended for applications that benefit from this combined behavior.
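SLERP merging interpolates two models' weight tensors along the arc between them rather than along a straight line, which better preserves each tensor's magnitude. The exact merge recipe for this model is not published here; the following is only a minimal NumPy sketch of the underlying interpolation step, applied per weight tensor (the function name and fallback logic are illustrative):

```python
import numpy as np

def slerp(t, a, b, eps=1e-8):
    """Spherically interpolate between tensors a and b at fraction t in [0, 1]."""
    # Flatten to vectors and normalize to measure the angle between them.
    a_f, b_f = a.ravel(), b.ravel()
    a_n = a_f / (np.linalg.norm(a_f) + eps)
    b_n = b_f / (np.linalg.norm(b_f) + eps)
    dot = np.clip(np.dot(a_n, b_n), -1.0, 1.0)
    theta = np.arccos(dot)
    if theta < eps:
        # Nearly colinear tensors: fall back to plain linear interpolation.
        return (1 - t) * a + t * b
    s = np.sin(theta)
    return (np.sin((1 - t) * theta) / s) * a + (np.sin(t * theta) / s) * b
```

A multi-stage merge, as described above, repeats this step: intermediate merges are produced pairwise, then merged again, so the final weights inherit contributions from all four base models.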
