CultriX/MoNeuTrix-7B-v1
Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 4K · Published: Mar 2, 2024 · License: apache-2.0 · Architecture: Transformer · Open Weights

MoNeuTrix-7B-v1 is a 7-billion-parameter language model published by CultriX. It was created with the DARE TIES merge method by combining three existing models: NeuralMaxime-7B-slerp, Monarch-7B, and ogno-monarch-jaskier-merge-7b. The merge is intended to pool the strengths of its constituent models, yielding a versatile base with balanced performance across general-purpose natural language processing tasks.
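To make the named merge method concrete, here is a toy, per-tensor sketch of the DARE TIES idea; it is not the mergekit implementation used to build this model, and the function name, tensors, and `drop_rate` value are illustrative assumptions:

```python
# Toy sketch of DARE TIES on a single weight tensor (illustrative only):
# 1) DARE: randomly drop a fraction `drop_rate` of each model's delta from
#    the base weights and rescale the survivors by 1 / (1 - drop_rate).
# 2) TIES: elect a majority sign per parameter and average only the deltas
#    whose sign agrees with the elected sign.
import torch

def dare_ties_merge(base, finetuned_list, drop_rate=0.5):
    deltas = []
    for w in finetuned_list:
        delta = w - base
        keep = torch.bernoulli(torch.full_like(delta, 1.0 - drop_rate))
        deltas.append(delta * keep / (1.0 - drop_rate))  # drop and rescale
    stacked = torch.stack(deltas)
    elected = torch.sign(stacked.sum(dim=0))             # majority sign per parameter
    agree = (torch.sign(stacked) == elected).float()
    merged_delta = (stacked * agree).sum(dim=0) / agree.sum(dim=0).clamp(min=1.0)
    return base + merged_delta

# Example with random tensors standing in for one weight matrix:
base = torch.randn(4, 4)
merged = dare_ties_merge(base, [base + 0.1 * torch.randn(4, 4) for _ in range(3)])
```

For running the model itself, a minimal usage sketch follows, assuming the weights are hosted on the Hugging Face Hub under the repo id shown above; the prompt, dtype, and generation settings are illustrative, not prescribed by the model card:

```python
# Minimal text-generation sketch with the transformers library.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "CultriX/MoNeuTrix-7B-v1"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # a 7B model in fp16 fits on a single ~16 GB GPU
    device_map="auto",
)

prompt = "Explain the DARE TIES merge method in one paragraph."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```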
