sabatico/dolphin-2.0-mistral-7b
Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 4k · License: apache-2.0 · Architecture: Transformer · Open Weights · Cold

sabatico/dolphin-2.0-mistral-7b is a 7-billion-parameter language model based on the MistralAI architecture, developed by Eric Hartford. This uncensored model is fine-tuned on a modified Dolphin dataset (an open-source implementation of Microsoft's Orca), combined with Jon Durbin's Airoboros dataset to enhance creativity. It is designed for high compliance with user requests, making it suitable for commercial and non-commercial applications, provided a custom alignment layer is implemented.
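Dolphin fine-tunes by Eric Hartford conventionally use the ChatML prompt format; assuming that holds for this model as well, a minimal sketch of building such a prompt (the helper name `build_chatml_prompt` is hypothetical, not part of the model's tooling):

```python
def build_chatml_prompt(system: str, user: str) -> str:
    """Assemble a single-turn ChatML prompt (assumed format for Dolphin fine-tunes)."""
    return (
        f"<|im_start|>system\n{system}<|im_end|>\n"
        f"<|im_start|>user\n{user}<|im_end|>\n"
        f"<|im_start|>assistant\n"  # generation continues from here
    )

prompt = build_chatml_prompt(
    "You are Dolphin, a helpful AI assistant.",
    "Summarize the Orca approach in one sentence.",
)
```

The system message is where a custom alignment layer would typically be expressed when deploying an uncensored model.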
