openaccess-ai-collective/DPOpenHermes-7B-v2
Text generation · Concurrency cost: 1 · Model size: 7B · Quant: FP8 · Context length: 4k · Published: Dec 6, 2023 · License: apache-2.0 · Architecture: Transformer
DPOpenHermes-7B-v2 is a 7-billion-parameter language model based on Mistral-7B, developed by openaccess-ai-collective and fine-tuned with Direct Preference Optimization (DPO) on the Intel/orca_dpo_pairs and allenai/ultrafeedback_binarized_cleaned preference datasets. This version addresses data contamination found in its predecessor and focuses on improved instruction following and multi-turn chat. It uses the ChatML prompt format, making it compatible with OpenAI-style message structures and well suited to system prompts.
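As a minimal sketch of the ChatML format the model expects, the helper below renders a list of role/content messages into a prompt string. The function name `build_chatml_prompt` is hypothetical; in practice the model's tokenizer chat template (e.g. `tokenizer.apply_chat_template` in Hugging Face Transformers) typically handles this for you.

```python
def build_chatml_prompt(messages):
    """Render a list of {role, content} dicts in ChatML format.

    Each turn is wrapped in <|im_start|>role ... <|im_end|> markers,
    and a trailing <|im_start|>assistant tag cues the model to reply.
    """
    parts = [
        f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>"
        for m in messages
    ]
    return "\n".join(parts) + "\n<|im_start|>assistant\n"

prompt = build_chatml_prompt([
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Explain DPO in one sentence."},
])
print(prompt)
```

Because the messages follow the same role/content shape as OpenAI-style chat APIs, the same list can be passed to either interface unchanged.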