viethq188/Rabbit-7B-v2-DPO-Chat
Text generation · Concurrency cost: 1 · Model size: 7B · Quantization: FP8 · Context length: 8k · Published: Dec 12, 2023 · License: apache-2.0 · Architecture: Transformer · Open weights

viethq188/Rabbit-7B-v2-DPO-Chat is a 7-billion-parameter language model created by viethq188. It was built by merging AIDC-ai-business/Marcoroni-7B-v3 and Q-bert/MetaMath-Cybertron-Starling with the slerp (spherical linear interpolation) merge method, then fine-tuned with DPO (Direct Preference Optimization) on Hugging Face data. It is designed for chat applications, with the merged weights and DPO training aimed at improved conversational performance.
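To illustrate the slerp merge method named above: slerp interpolates between two parent models' weight tensors along a great circle rather than a straight line, which preserves the magnitude characteristics of the weights better than plain averaging. The sketch below shows the core interpolation formula on small vectors; it is a minimal illustration under stated assumptions, not the actual merge configuration or per-layer interpolation schedule used for this model.

```python
import math

def slerp(t, v0, v1, eps=1e-8):
    """Spherical linear interpolation between two weight vectors.

    t is the interpolation factor: 0.0 returns v0, 1.0 returns v1.
    """
    norm0 = math.sqrt(sum(x * x for x in v0))
    norm1 = math.sqrt(sum(x * x for x in v1))
    # Cosine of the angle between the two vectors, clamped for safety
    dot = sum(a * b for a, b in zip(v0, v1)) / (norm0 * norm1)
    dot = max(-1.0, min(1.0, dot))
    omega = math.acos(dot)
    if abs(math.sin(omega)) < eps:
        # Nearly parallel vectors: fall back to linear interpolation
        return [(1 - t) * a + t * b for a, b in zip(v0, v1)]
    s0 = math.sin((1 - t) * omega) / math.sin(omega)
    s1 = math.sin(t * omega) / math.sin(omega)
    return [s0 * a + s1 * b for a, b in zip(v0, v1)]

# Hypothetical example: blend two "layer weight" vectors halfway.
merged = slerp(0.5, [1.0, 0.0], [0.0, 1.0])
```

In an actual merge (e.g. with mergekit's `slerp` method), this interpolation is applied tensor-by-tensor across the two parent checkpoints, often with different `t` values for attention and MLP layers.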
