arcee-ai/SEC-MBX-7B-DPO
Task: Text Generation
Concurrency Cost: 1
Model Size: 7B
Quant: FP8
Context Length: 4k
Published: Mar 31, 2024
License: apache-2.0
Architecture: Transformer
Open Weights

arcee-ai/SEC-MBX-7B-DPO is a 7-billion-parameter language model from arcee-ai, created by merging arcee-ai/sec-mistral-7b-instruct-1.2-epoch with macadeliccc/MBX-7B-v3-DPO. It uses a Mistral-based architecture with a 4096-token context length; the DPO component derives from MBX-7B-v3-DPO, which was tuned with Direct Preference Optimization. The model is designed for general language understanding and generation tasks, combining the strengths of its constituent models.
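Below is a minimal usage sketch for running the model locally with the Hugging Face `transformers` library, assuming the weights are published on the Hugging Face Hub under the repo id `arcee-ai/SEC-MBX-7B-DPO`; the prompt string is purely illustrative.

```python
# Minimal generation sketch, assuming the model is available on the
# Hugging Face Hub as "arcee-ai/SEC-MBX-7B-DPO" (repo id is an assumption).
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "arcee-ai/SEC-MBX-7B-DPO"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    device_map="auto",   # place layers on available GPU(s); requires `accelerate`
    torch_dtype="auto",  # use the dtype stored in the checkpoint
)

# Example prompt; keep prompt + output within the 4096-token context length.
prompt = "Summarize the key risk factors typically disclosed in a 10-K filing."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Since the listing advertises an FP8 quant, a serving stack with FP8 support (e.g., vLLM on suitable hardware) may be preferable for deployment; the snippet above loads the standard checkpoint weights as-is.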
