CultriX/OmniTrixAI
Text Generation | Concurrency Cost: 1 | Model Size: 7B | Quantization: FP8 | Context Length: 4k | Published: Jan 25, 2024 | License: apache-2.0 | Architecture: Transformer | Open Weights
CultriX/OmniTrixAI is a 7-billion-parameter language model created by CultriX, formed by merging mlabonne/NeuralBeagle14-7B, FelixChao/WestSeverus-7B-DPO-v2, and CultriX/MergeTrix-7B-v2 with the DARE TIES merge method. The model targets general-purpose text generation, combining the strengths of its constituent models, and its 4096-token context length suits a range of conversational and content-creation applications.
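To illustrate the DARE TIES method mentioned above, here is a minimal numpy sketch: DARE randomly drops entries of each model's task-vector delta and rescales the survivors, and a simplified TIES-style sign election then keeps only the entries that agree with the majority sign before averaging. This is an assumption-laden illustration (hypothetical drop probability, sign election via the sign of the summed deltas), not the exact mergekit implementation used to build this model.

```python
import numpy as np

def dare_ties_merge(base, deltas, drop_p=0.5, seed=0):
    """Sketch of DARE TIES on flat parameter arrays.

    base   : base-model parameters (1-D array)
    deltas : list of (finetuned - base) task vectors, same shape as base
    drop_p : DARE drop probability (hypothetical value; the real merge
             config for this model is not published here)
    """
    rng = np.random.default_rng(seed)

    # DARE: drop each delta entry with probability drop_p, rescale the rest
    # so the expected contribution is unchanged.
    sparsified = []
    for delta in deltas:
        mask = rng.random(delta.shape) >= drop_p
        sparsified.append(delta * mask / (1.0 - drop_p))

    # Simplified TIES sign election: take the majority sign per parameter
    # from the summed sparsified deltas.
    elected_sign = np.sign(np.sum(sparsified, axis=0))

    # Keep only entries whose sign agrees with the elected sign, then
    # average the agreeing entries per parameter.
    agree = [np.where(np.sign(d) == elected_sign, d, 0.0) for d in sparsified]
    counts = np.sum([a != 0 for a in agree], axis=0)
    merged_delta = np.sum(agree, axis=0) / np.maximum(counts, 1)

    return base + merged_delta
```

With `drop_p=0` and unanimous deltas this reduces to adding the average delta onto the base weights; raising `drop_p` sparsifies each contribution while preserving its expected magnitude.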