Eric111/Snorkel-Mistral-PairRM-DPO-openchat-3.5-0106-laser
Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quantization: FP8 · Context Length: 4k · Published: Mar 3, 2024 · License: apache-2.0 · Architecture: Transformer · Open Weights
Eric111/Snorkel-Mistral-PairRM-DPO-openchat-3.5-0106-laser is a 7-billion-parameter language model created by Eric111 by merging Snorkel-Mistral-PairRM-DPO and openchat-3.5-0106-laser with mergekit. The merge combines a DPO-trained Mistral variant with an OpenChat iteration, and the resulting model is intended for general-purpose language tasks, blending the capabilities of its constituent models.
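The exact merge recipe has not been published, so the method and parameters below are assumptions. As an illustration only, a mergekit SLERP config merging the two constituent models might look like this (the repository paths, `layer_range`, and `t` value are hypothetical):

```yaml
# Illustrative mergekit config -- NOT the published recipe.
# Model paths and parameters are assumed for the sake of example.
slices:
  - sources:
      - model: snorkelai/Snorkel-Mistral-PairRM-DPO
        layer_range: [0, 32]
      - model: openchat-3.5-0106-laser   # assumed local or hub path
        layer_range: [0, 32]
merge_method: slerp
base_model: snorkelai/Snorkel-Mistral-PairRM-DPO
parameters:
  t: 0.5        # equal interpolation between the two models (assumed)
dtype: bfloat16
```

A config like this would be run with `mergekit-yaml config.yml ./output-model`; the actual merge may have used a different method (e.g. linear or TIES) or weighting.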