uukuguy/speechless-mistral-7b-dare-0.85
Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 4k · Published: Nov 22, 2023 · License: llama2 · Architecture: Transformer · Open Weights

uukuguy/speechless-mistral-7b-dare-0.85 is a 7 billion parameter Mistral-based language model developed by uukuguy, produced using the DARE (Drop And REscale) method. The model retains over 97.5% of the original performance even though 85% of its delta parameters (the differences between the fine-tuned and base weights) are set to zero, demonstrating that most of the fine-tuning signal is highly redundant. It shows improved performance on MMLU and maintains strong results on ARC, HellaSwag, and Winogrande, making it suitable for general language understanding tasks where parameter efficiency is desired.
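The core DARE operation can be sketched in a few lines: compute the delta between the fine-tuned and base weights, randomly drop a fraction `p` of the delta entries (here 0.85), rescale the survivors by `1 / (1 - p)` to preserve the expected delta, and add the result back to the base weights. This is a minimal NumPy illustration on a toy weight matrix, not the exact pipeline used to build this model; the function name `dare_merge` and the toy shapes are assumptions for the example.

```python
import numpy as np

def dare_merge(base, finetuned, drop_rate=0.85, seed=0):
    """Minimal sketch of DARE (Drop And REscale).

    Zeroes out a random `drop_rate` fraction of the delta weights
    (finetuned - base), rescales the surviving deltas by
    1 / (1 - drop_rate) so their expected value is unchanged, and
    adds the result back onto the base weights.
    """
    rng = np.random.default_rng(seed)
    delta = finetuned - base
    keep_mask = rng.random(delta.shape) >= drop_rate  # keep ~15% of deltas
    rescaled = np.where(keep_mask, delta / (1.0 - drop_rate), 0.0)
    return base + rescaled

# Toy example: a base of zeros and a "fine-tuned" matrix of ones,
# so every delta entry is exactly 1.0 before dropping.
base = np.zeros((8, 8))
finetuned = np.ones((8, 8))
merged = dare_merge(base, finetuned, drop_rate=0.85)
```

After merging, each entry of `merged` is either 0 (dropped) or `1 / 0.15 ≈ 6.67` (kept and rescaled), so the expected value of each entry still matches the original delta of 1.0.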
