CultriX/NeuralTrix-7B-v1
Text generation · Concurrency cost: 1 · Model size: 7B · Quant: FP8 · Context length: 4k · Published: Feb 8, 2024 · License: apache-2.0 · Architecture: Transformer · Open weights

NeuralTrix-7B-v1 is a 7-billion-parameter language model developed by CultriX, created by merging OmniBeagle-7B, MBX-7B-v3, and AiMaven-Prometheus using the DARE TIES method. Built on the Mistral-7B-v0.1 base, the model supports a 4096-token context window. It is designed to combine the strengths of its constituent models for enhanced general-purpose text generation.
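A DARE TIES merge like the one described above is typically expressed as a mergekit configuration. The sketch below is illustrative only: the model card does not publish the actual recipe, so the repository paths, `density`, and `weight` values here are assumptions, not the settings CultriX used.

```yaml
# Hypothetical mergekit config for a DARE TIES merge of the three
# constituent models onto the Mistral-7B-v0.1 base.
# density/weight values and repo paths are illustrative assumptions.
models:
  - model: OmniBeagle-7B
    parameters:
      density: 0.6   # fraction of delta weights kept (DARE drop rate = 1 - density)
      weight: 0.4    # relative contribution in the TIES combination
  - model: MBX-7B-v3
    parameters:
      density: 0.6
      weight: 0.3
  - model: AiMaven-Prometheus
    parameters:
      density: 0.6
      weight: 0.3
merge_method: dare_ties
base_model: mistralai/Mistral-7B-v0.1
dtype: bfloat16
```

In DARE TIES, each fine-tuned model's delta from the base is randomly sparsified and rescaled (DARE), then sign conflicts between models are resolved before the deltas are summed (TIES), which is why each entry carries both a `density` and a `weight` parameter.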
