openaccess-ai-collective/DPOpenHermes-7B
Task: Text generation · Model size: 7B · Quantization: FP8 · Context length: 4k · Published: Dec 2, 2023 · License: apache-2.0 · Architecture: Transformer

DPOpenHermes-7B is a 7-billion-parameter language model from openaccess-ai-collective, fine-tuned from Teknium's OpenHermes-2.5-Mistral-7B using Direct Preference Optimization (DPO). It is optimized for multi-turn chat and structured system prompts, using the ChatML format. This makes it well suited for conversational applications that rely on explicit system instructions.
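ChatML delimits each turn with `<|im_start|>` and `<|im_end|>` markers, with the role name on the first line. A minimal sketch of assembling such a prompt (the helper function name is illustrative, not part of the model's tooling):

```python
def build_chatml_prompt(system, turns):
    """Assemble a ChatML prompt from a system message and (role, text) turns."""
    parts = [f"<|im_start|>system\n{system}<|im_end|>"]
    for role, text in turns:
        parts.append(f"<|im_start|>{role}\n{text}<|im_end|>")
    # Open an assistant turn so the model continues from here
    parts.append("<|im_start|>assistant\n")
    return "\n".join(parts)

prompt = build_chatml_prompt(
    "You are a helpful assistant.",
    [("user", "Summarize what DPO fine-tuning does.")],
)
print(prompt)
```

In practice, a chat library that supports ChatML templates (for example, the tokenizer's chat template in Hugging Face Transformers) can produce the same string automatically.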
