dphn/dolphin-2.6-mistral-7b-dpo
Task: Text generation · Model size: 7B · Quant: FP8 · Ctx length: 4k · Concurrency cost: 1 · Published: Dec 31, 2023 · License: apache-2.0 · Architecture: Transformer · Open weights

Dolphin 2.6 Mistral 7B DPO is a 7-billion-parameter language model from dphn, built on Mistral-7B with a 4096-token context length. It is DPO-tuned for high compliance and was trained on extensive coding data, so it performs well on coding tasks. The model is uncensored and highly obedient to user requests, making it suited to applications that require direct, unfiltered responses.
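Dolphin models generally use the ChatML prompt format. The sketch below builds such a prompt by hand; the exact template is an assumption here and should be verified against the model's chat template (e.g. via the tokenizer's `apply_chat_template`) before use:

```python
def chatml_prompt(system: str, user: str) -> str:
    """Build a ChatML-style prompt string.

    Format assumed from common Dolphin usage; confirm against the
    model's tokenizer config before relying on it.
    """
    return (
        f"<|im_start|>system\n{system}<|im_end|>\n"
        f"<|im_start|>user\n{user}<|im_end|>\n"
        "<|im_start|>assistant\n"
    )

# Example: a system prompt plus a single user turn.
prompt = chatml_prompt(
    "You are Dolphin, a helpful coding assistant.",
    "Write a Python one-liner to reverse a string.",
)
```

The resulting string ends with an open `assistant` turn, so the model continues generating from there.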
