CultriX/MergeCeption-7B-v3
Task: Text generation
Model size: 7B
Quantization: FP8
Context length: 4k
Published: Mar 13, 2024
License: apache-2.0
Architecture: Transformer

CultriX/MergeCeption-7B-v3 is a 7-billion-parameter language model created by CultriX, produced through a DARE TIES merge of several specialized models, including Kukedlc/NeuralMaxime-7B-slerp, mlabonne/Monarch-7B, and CultriX/NeuralTrix-bf16. The merge combines the strengths of its constituent models, and the result is intended for general text-generation tasks, drawing on the knowledge and styles of its predecessors.
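The exact merge recipe is not published here, but a DARE TIES merge of the listed models would typically be expressed as a mergekit configuration along these lines. The `density` and `weight` values and the choice of base model are illustrative assumptions, not the author's actual settings:

```yaml
# Hypothetical mergekit config sketching a DARE TIES merge of the listed
# source models. Densities, weights, and the base model are assumptions.
models:
  - model: Kukedlc/NeuralMaxime-7B-slerp
    parameters:
      density: 0.5   # fraction of delta parameters kept (assumed)
      weight: 0.4    # contribution to the merged model (assumed)
  - model: mlabonne/Monarch-7B
    parameters:
      density: 0.5
      weight: 0.3
  - model: CultriX/NeuralTrix-bf16
    parameters:
      density: 0.5
      weight: 0.3
merge_method: dare_ties
base_model: mistralai/Mistral-7B-v0.1   # assumed common base for these 7B models
parameters:
  int8_mask: true
dtype: bfloat16
```

With mergekit installed, a config like this is typically run with `mergekit-yaml config.yml ./output-model`.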
