uukuguy/mistral-7b-platypus-fp16-dare-0.9
Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 8k · Published: Nov 20, 2023 · License: llama2 · Architecture: Transformer · Open Weights

The uukuguy/mistral-7b-platypus-fp16-dare-0.9 model is a 7-billion-parameter Mistral-based language model produced with the experimental DARE (Drop And REscale) method, here with a drop rate of 0.9, as the model name suggests. DARE randomly discards a large fraction of the fine-tuned delta parameters and rescales the remainder, probing how well larger models tolerate a high proportion of dropped parameters without a significant loss of capability. The model is intended for general language understanding and generation tasks, with a focus on exploring parameter efficiency.
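The drop-and-rescale idea can be sketched on a single weight tensor. This is a minimal illustration, not the actual merge pipeline used to build this model: it assumes access to the base and fine-tuned weights as arrays, drops each delta entry with probability `drop_rate`, and rescales the survivors by `1 / (1 - drop_rate)` so the expected delta is preserved. The function name `dare_merge` is hypothetical.

```python
import numpy as np

def dare_merge(base, finetuned, drop_rate=0.9, seed=0):
    """Sketch of DARE (Drop And REscale) on one weight tensor.

    Each delta (finetuned - base) entry is dropped with probability
    `drop_rate`; surviving deltas are rescaled by 1 / (1 - drop_rate)
    so the delta is preserved in expectation.
    """
    rng = np.random.default_rng(seed)
    delta = finetuned - base
    # Keep each entry with probability (1 - drop_rate); ~10% survive at 0.9.
    keep_mask = rng.random(delta.shape) >= drop_rate
    rescaled = np.where(keep_mask, delta / (1.0 - drop_rate), 0.0)
    return base + rescaled

# Toy example: a uniform fine-tuning delta of 0.02 on a zero base.
base = np.zeros((4, 4))
finetuned = np.full((4, 4), 0.02)
merged = dare_merge(base, finetuned, drop_rate=0.9)
# Surviving entries carry 0.02 / 0.1 = 0.2; dropped entries stay 0.
print(merged.round(3))
```

At a 0.9 drop rate, roughly 90% of the merged entries remain at the base value, while the surviving 10% carry ten times the original delta, which is the parameter-efficiency trade-off the model card describes.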
