FelixChao/Severus-7B
Text generation | Concurrency cost: 1 | Model size: 7B | Quant: FP8 | Context length: 4k | Published: Jan 20, 2024 | License: apache-2.0 | Architecture: Transformer

FelixChao/Severus-7B is a 7-billion-parameter language model created by FelixChao, formed by merging samir-fama/FernandoGPT-v1 and FelixChao/NinjaDolphin-7B. The merge uses the 'passthrough' method, which stitches together selected layer ranges from each constituent model rather than averaging their weights. The model is intended for general text generation and supports a 4096-token context length.
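A passthrough merge of this kind is typically expressed as a mergekit configuration. The sketch below is illustrative only: the layer ranges shown are hypothetical, as the actual ranges used for Severus-7B are not stated here.

```yaml
# Hypothetical mergekit config for a passthrough merge.
# Layer ranges are placeholders, not the ranges actually used by Severus-7B.
slices:
  - sources:
      - model: samir-fama/FernandoGPT-v1
        layer_range: [0, 24]   # take the lower layers from FernandoGPT-v1
  - sources:
      - model: FelixChao/NinjaDolphin-7B
        layer_range: [8, 32]   # take the upper layers from NinjaDolphin-7B
merge_method: passthrough       # concatenate slices; no weight interpolation
dtype: bfloat16
```

Because passthrough copies layers verbatim instead of interpolating weights, overlapping ranges can yield a merged model with more layers than either parent.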
