dphn/dolphin-2.0-mistral-7b
Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 4k · Published: Oct 2, 2023 · License: apache-2.0 · Architecture: Transformer · Open Weights
Dolphin-2.0-mistral-7b is a 7-billion-parameter language model developed by Eric Hartford, based on the MistralAI architecture with a 4096-token context length. It is fine-tuned on a modified Dolphin dataset, an open-source implementation of Microsoft's Orca, augmented with the Airoboros dataset to enhance creativity. The model is uncensored and trained for high compliance with user requests, making it suitable for commercial and non-commercial applications in which the user implements their own alignment layer. It achieves an average score of 58.58 on the Open LLM Leaderboard, with notable performance on HellaSwag (80.26) and Winogrande (75.37).
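The Dolphin 2.x series of models is typically prompted with the ChatML template (system/user/assistant turns delimited by `<|im_start|>` and `<|im_end|>` tokens); the page above does not state the prompt format, so treat this as an assumption to verify against the upstream model card. A minimal sketch of building such a prompt, with a hypothetical helper name:

```python
def chatml_prompt(system: str, user: str) -> str:
    """Build a ChatML-style prompt string (assumed format for Dolphin 2.x).

    The returned string ends with an open assistant turn, so the model's
    completion becomes the assistant's reply.
    """
    return (
        f"<|im_start|>system\n{system}<|im_end|>\n"
        f"<|im_start|>user\n{user}<|im_end|>\n"
        f"<|im_start|>assistant\n"
    )

# Example: an uncensored model relies on the system message for alignment,
# so a user-supplied alignment layer would typically be injected here.
prompt = chatml_prompt(
    system="You are a helpful assistant. Refuse unsafe requests.",
    user="Summarize the Orca paper in one sentence.",
)
print(prompt)
```

Because the model itself is uncensored, any desired alignment or safety behavior would be supplied through the system turn (or filtering around it), as the description above suggests.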