uukuguy/airoboros-m-7b-3.1.2-dare-0.85
Text Generation
- Concurrency Cost: 1
- Model Size: 7B
- Quant: FP8
- Ctx Length: 8k
- Published: Nov 22, 2023
- License: apache-2.0
- Architecture: Transformer

The uukuguy/airoboros-m-7b-3.1.2-dare-0.85 model is a 7-billion-parameter language model derived from airoboros-m-7b-3.1.2 using the DARE (Drop And REscale) method. DARE randomly drops 85% of the delta parameters (the difference between the fine-tuned weights and the base model weights) and rescales the survivors to preserve the expected delta, largely retaining the model's capabilities while making it suitable for efficient deployment and merging. The model is an experiment in parameter reduction for supervised fine-tuned (SFT) LLMs, demonstrating that such models can tolerate discarding a large fraction of their fine-tuning deltas.
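The drop-and-rescale step described above can be sketched for a single weight tensor as follows. This is a minimal illustration of the general DARE idea, not the exact merge script used to produce this model; the function name and NumPy-based setup are assumptions for demonstration.

```python
import numpy as np

def dare_merge(base, finetuned, drop_rate=0.85, seed=0):
    """Sketch of DARE (Drop And REscale) on one weight tensor.

    Randomly drops `drop_rate` of the delta parameters
    (finetuned - base) and rescales the survivors by
    1 / (1 - drop_rate), so the expected delta is unchanged.
    """
    rng = np.random.default_rng(seed)
    delta = finetuned - base
    keep_mask = rng.random(delta.shape) >= drop_rate  # keep ~15% of deltas
    rescaled = delta * keep_mask / (1.0 - drop_rate)
    return base + rescaled
```

With `drop_rate=0.85`, roughly 15% of the delta entries survive, each scaled up by about 6.67x; in expectation the merged weights match the original fine-tuned weights.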
