Eric111/MistInst-v0.2_ochat-3.5-0106_dpo-binarized-NeuralTrix-7B
Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 4k · Published: Mar 6, 2024 · License: apache-2.0 · Architecture: Transformer · Open Weights · Cold

Eric111/MistInst-v0.2_ochat-3.5-0106_dpo-binarized-NeuralTrix-7B is a 7-billion-parameter language model created by Eric111. It is the result of a slerp merge of Mistral-7B-Instruct-v0.2_openchat-3.5-0106 and dpo-binarized-NeuralTrix-7B, combining the strengths of its base components. The model supports a 4,096-token context length and is intended for general text generation and instruction-following tasks.
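For context, a slerp (spherical linear interpolation) merge blends two models' weights along the arc between them rather than along a straight line, which tends to preserve each parent's weight geometry better than plain averaging. Below is a minimal, illustrative sketch of the slerp formula applied to two weight tensors; the function name and fallback behavior are assumptions for illustration, not the exact procedure used to produce this model (which was likely built with a merge toolkit such as mergekit).

```python
import numpy as np

def slerp(t, v0, v1, eps=1e-8):
    """Spherical linear interpolation between two weight tensors.

    t is the interpolation factor in [0, 1]: 0 returns v0, 1 returns v1.
    Hypothetical helper for illustration, not the model's actual merge code.
    """
    v0f = v0.ravel()
    v1f = v1.ravel()
    # Cosine of the angle between the flattened tensors
    cos = np.dot(v0f, v1f) / (np.linalg.norm(v0f) * np.linalg.norm(v1f) + eps)
    cos = np.clip(cos, -1.0, 1.0)
    omega = np.arccos(cos)
    # Fall back to linear interpolation when the tensors are nearly parallel
    if abs(np.sin(omega)) < eps:
        return (1.0 - t) * v0 + t * v1
    s0 = np.sin((1.0 - t) * omega) / np.sin(omega)
    s1 = np.sin(t * omega) / np.sin(omega)
    return s0 * v0 + s1 * v1

# Example: interpolating halfway between two orthogonal unit vectors
a = np.array([1.0, 0.0])
b = np.array([0.0, 1.0])
mid = slerp(0.5, a, b)  # stays on the unit sphere, unlike plain averaging
```

In a real merge this interpolation is applied per weight tensor (often with different `t` values for different layer groups), whereas plain linear averaging of orthogonal directions would shrink the result toward the origin.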
