CultriX/MonaTrix-v4
Task: Text generation
Model size: 7B
Quantization: FP8
Context length: 4k
Published: Feb 20, 2024
License: apache-2.0
Architecture: Transformer

MonaTrix-v4 by CultriX is a 7-billion-parameter language model created through a DARE TIES merge of NeuralMaxime-7B-slerp, ogno-monarch-jaskier-merge-7b, and dpo-binarized-NeutrixOmnibe-7B, with Mistral-7B-v0.1 as the base model. The merge assigns per-model weight and density parameters to combine the strengths of its constituent models. It is designed for general text generation tasks, offering a balanced performance profile derived from its diverse merge components.
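A DARE TIES merge like this is typically produced with mergekit, which reads a YAML config naming the base model, the merge method, and each constituent's weight and density. The sketch below is illustrative only: the repository owners, weights, densities, and dtype are assumptions, not the actual values used to build MonaTrix-v4.

```yaml
# Hypothetical mergekit config in the shape used for a dare_ties merge.
# Model repo paths, weights, and densities are placeholders.
models:
  - model: mistralai/Mistral-7B-v0.1
    # base model carries no merge parameters
  - model: NeuralMaxime-7B-slerp
    parameters:
      density: 0.5   # fraction of delta weights kept after DARE pruning (assumed)
      weight: 0.3    # contribution of this model's task vector (assumed)
  - model: ogno-monarch-jaskier-merge-7b
    parameters:
      density: 0.5
      weight: 0.3
  - model: dpo-binarized-NeutrixOmnibe-7B
    parameters:
      density: 0.5
      weight: 0.4
merge_method: dare_ties
base_model: mistralai/Mistral-7B-v0.1
dtype: bfloat16
```

In a DARE TIES merge, each model's difference from the base ("task vector") is randomly pruned to the given density, rescaled, and then sign-consolidated TIES-style before the weighted sum is added back to the base weights.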
