Sela223/Repose-Marlin-12B
Text generation · Model size: 12B · Quantization: FP8 · Context length: 32k · Architecture: Transformer · Published: Mar 12, 2026

Sela223/Repose-Marlin-12B is a 12 billion parameter language model created by Sela223, produced by merging UsernameJustAnother/Nemo-12B-Marlin-v8 and KatyTheCutie/Repose-V2-2B with spherical linear interpolation (SLERP). The merge applies a layer-wise parameter weighting strategy across the attention, MLP, and normalization blocks to combine the strengths of the constituent models. It is intended for general language generation tasks, integrating the capabilities of its merged components.
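At its core, a SLERP merge interpolates each pair of corresponding weight tensors along the arc between them rather than along a straight line, which tends to preserve tensor magnitude better than plain averaging. A minimal sketch of the per-tensor operation, assuming NumPy arrays (the function, toy matrices, and interpolation factor are illustrative; they are not the model's actual merge configuration):

```python
import numpy as np

def slerp(t, v0, v1, eps=1e-8):
    """Spherical linear interpolation between two weight tensors.

    t=0 returns v0, t=1 returns v1; intermediate values follow the
    great-circle arc between the flattened tensors.
    """
    v0_f = v0.ravel().astype(np.float64)
    v1_f = v1.ravel().astype(np.float64)
    # Normalize copies only to measure the angle between the tensors.
    n0 = v0_f / (np.linalg.norm(v0_f) + eps)
    n1 = v1_f / (np.linalg.norm(v1_f) + eps)
    dot = np.clip(np.dot(n0, n1), -1.0, 1.0)
    theta = np.arccos(dot)
    # Nearly parallel tensors: fall back to ordinary linear interpolation.
    if abs(theta) < 1e-6:
        return ((1.0 - t) * v0_f + t * v1_f).reshape(v0.shape)
    sin_theta = np.sin(theta)
    w0 = np.sin((1.0 - t) * theta) / sin_theta
    w1 = np.sin(t * theta) / sin_theta
    return (w0 * v0_f + w1 * v1_f).reshape(v0.shape)

# Example: blend two toy weight matrices halfway.
a = np.eye(2)
b = np.ones((2, 2))
merged = slerp(0.5, a, b)
```

In a real merge, a tool such as mergekit applies this tensor-by-tensor, with the interpolation factor `t` varying per layer and per block type (attention, MLP, normalization), which is what the layer-wise weighting strategy above refers to.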
