viethq188/Rabbit-7B-DPO-Chat
Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 8k · Published: Dec 12, 2023 · License: apache-2.0 · Architecture: Transformer · Open Weights
viethq188/Rabbit-7B-DPO-Chat is a 7-billion-parameter language model created by viethq188. It was built by merging AIDC-ai-business/Marcoroni-7B-v3 and rwitz/go-bruins-v2 via spherical linear interpolation (slerp), then fine-tuned with DPO (Direct Preference Optimization) on preference data from Hugging Face, yielding a model optimized for chat-based interaction. It supports an 8192-token context length and is intended for general conversational applications.
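The slerp merge mentioned above interpolates each pair of corresponding weight tensors along the great-circle arc between them rather than along a straight line, which preserves the magnitude geometry of the parameters better than plain averaging. As a minimal illustrative sketch (not the actual merge tooling used for this model), slerp on a flat list of floats can be written as:

```python
import math

def slerp(v0, v1, t, eps=1e-8):
    """Spherical linear interpolation between two weight vectors.

    t=0 returns v0, t=1 returns v1; intermediate t moves along the
    arc between them. Falls back to linear interpolation when the
    vectors are nearly parallel (sin of the angle is ~0).
    """
    dot = sum(a * b for a, b in zip(v0, v1))
    n0 = math.sqrt(sum(a * a for a in v0))
    n1 = math.sqrt(sum(b * b for b in v1))
    # Clamp to avoid domain errors from floating-point drift.
    cos_omega = max(-1.0, min(1.0, dot / (n0 * n1)))
    omega = math.acos(cos_omega)
    if abs(math.sin(omega)) < eps:
        # Nearly parallel: ordinary linear interpolation is fine.
        return [(1 - t) * a + t * b for a, b in zip(v0, v1)]
    s0 = math.sin((1 - t) * omega) / math.sin(omega)
    s1 = math.sin(t * omega) / math.sin(omega)
    return [s0 * a + s1 * b for a, b in zip(v0, v1)]
```

In a real merge, this interpolation is applied tensor-by-tensor across the two checkpoints (tools such as mergekit implement this at scale); the snippet only shows the underlying formula.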