CultriX/SevereNeuralBeagleTrix-7B
Task: text generation
Model size: 7B parameters
Quantization: FP8
Context length: 4k
Concurrency cost: 1
Published: Jan 25, 2024
License: apache-2.0
Architecture: Transformer (open weights)

CultriX/SevereNeuralBeagleTrix-7B is a 7-billion-parameter language model developed by CultriX. It was created through a DARE TIES merge of PetroGPT/WestSeverus-7B-DPO, CultriX/MergeTrix-7B-v2, and mlabonne/NeuralBeagle14-7B, on the Mistral-7B-v0.1 architecture. The model targets general-purpose text generation and instruction following, combining the strengths of its constituent models; its merge configuration suggests balanced performance across tasks rather than specialization, making it a capable general-purpose 7B model.
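DARE TIES merges of this kind are commonly produced with the mergekit toolkit. A minimal sketch of what such a merge configuration could look like follows; the weight and density values are illustrative assumptions, not CultriX's published settings:

```yaml
# Hypothetical mergekit config for a DARE TIES merge of the three source
# models onto the Mistral-7B-v0.1 base. All weight/density values below
# are illustrative assumptions, not the actual merge recipe.
models:
  - model: PetroGPT/WestSeverus-7B-DPO
    parameters:
      weight: 0.33    # assumed relative contribution
      density: 0.5    # assumed fraction of delta weights kept (DARE pruning)
  - model: CultriX/MergeTrix-7B-v2
    parameters:
      weight: 0.33
      density: 0.5
  - model: mlabonne/NeuralBeagle14-7B
    parameters:
      weight: 0.33
      density: 0.5
merge_method: dare_ties
base_model: mistralai/Mistral-7B-v0.1
dtype: bfloat16
```

In a DARE TIES merge, each source model's delta from the base is randomly pruned (controlled by `density`) before sign-consistent averaging, which reduces parameter interference between the merged models.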
