shadowml/Daredevil-7B
Text generation · Concurrency cost: 1 · Model size: 7B · Quantization: FP8 · Context length: 4k · Published: Jan 6, 2024 · License: apache-2.0 · Architecture: Transformer
Daredevil-7B is a 7-billion-parameter language model published by shadowml and based on the Mistral-7B-v0.1 architecture. It is a merge of SamirGPT-v1, Slerp-CM-mist-dpo, and Mistral-7B-Merge-14-v0.2, combined using the dare_ties merge method. The model achieves an average score of 73.36 on the Open LLM Leaderboard, indicating strong performance across reasoning and language-understanding benchmarks, and is suitable for general-purpose text generation and conversational AI applications.
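Merges like this are typically produced with the mergekit tool, whose config file names the source models, the merge method, and per-model parameters. The sketch below is illustrative only: the repository paths, density, and weight values are assumptions, not the actual recipe used for Daredevil-7B.

```yaml
# Hypothetical mergekit config for a dare_ties merge in the style of
# Daredevil-7B. Repo paths, densities, and weights are placeholders.
models:
  - model: mistralai/Mistral-7B-v0.1
    # Base model: contributes no task vector of its own.
  - model: SamirGPT-v1            # full repo path omitted; assumed
    parameters:
      density: 0.5   # fraction of delta weights kept after random pruning
      weight: 0.4    # scaling applied when task vectors are summed
  - model: Slerp-CM-mist-dpo      # full repo path omitted; assumed
    parameters:
      density: 0.5
      weight: 0.3
  - model: Mistral-7B-Merge-14-v0.2  # full repo path omitted; assumed
    parameters:
      density: 0.5
      weight: 0.3
merge_method: dare_ties
base_model: mistralai/Mistral-7B-v0.1
dtype: bfloat16
```

In dare_ties, each fine-tuned model's difference from the base ("task vector") is randomly pruned to the given density, rescaled, and the survivors are combined with TIES-style sign election before being added back to the base weights, which is what lets several 7B fine-tunes merge without simply averaging away their individual strengths.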