FelixChao/WestSeverus-7B
TEXT GENERATION · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 4k · Published: Jan 23, 2024 · License: apache-2.0 · Architecture: Transformer · Open Weights

FelixChao/WestSeverus-7B is a 7-billion-parameter language model created by merging senseable/WestLake-7B-v2 and FelixChao/Severus-7B with a slerp (spherical linear interpolation) merge. The merge combines the strengths of its constituent models, giving a balanced performance profile for general text generation. With a 4096-token context window, it suits applications that require moderate-length context understanding and generation.
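To illustrate the merge method mentioned above, here is a minimal sketch of slerp applied to a pair of weight tensors. This is a generic illustration of spherical linear interpolation, not the exact merge configuration used to produce WestSeverus-7B (tools like mergekit typically apply per-layer interpolation factors); the function name and epsilon fallback are our own choices.

```python
import numpy as np

def slerp(t, v0, v1, eps=1e-8):
    """Spherically interpolate between two weight tensors.

    t=0 returns v0, t=1 returns v1; intermediate t follows the
    great-circle arc between the flattened tensors.
    """
    v0_f = v0.flatten().astype(np.float64)
    v1_f = v1.flatten().astype(np.float64)
    # Cosine of the angle between the two (normalized) tensors.
    dot = np.dot(v0_f / np.linalg.norm(v0_f),
                 v1_f / np.linalg.norm(v1_f))
    theta = np.arccos(np.clip(dot, -1.0, 1.0))
    if theta < eps:
        # Nearly parallel tensors: fall back to linear interpolation.
        return (1 - t) * v0 + t * v1
    sin_theta = np.sin(theta)
    s0 = np.sin((1 - t) * theta) / sin_theta
    s1 = np.sin(t * theta) / sin_theta
    return (s0 * v0_f + s1 * v1_f).reshape(v0.shape)
```

In a real merge, this interpolation would be applied parameter-by-parameter across the two source checkpoints, with `t` possibly varying by layer.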
