uukuguy/CollectiveCognition-v1.1-Mistral-7B-dare-0.85
Text generation · Model size: 7B · Quantization: FP8 · Context length: 8k · Published: Nov 22, 2023 · License: llama2 · Architecture: Transformer · Concurrency cost: 1

uukuguy/CollectiveCognition-v1.1-Mistral-7B-dare-0.85 is a 7-billion-parameter Mistral-based language model by uukuguy, built with the experimental DARE (Drop And REscale) technique. DARE sets a high proportion of the delta parameters (the differences between the fine-tuned and base weights) to zero, here with a drop rate of 0.85, and rescales the surviving deltas to preserve their expected contribution. The model explores whether capabilities can be maintained under such aggressive sparsification, offering a lightweight approach to model merging and optimization.
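The drop-and-rescale idea can be illustrated with a minimal sketch. This is not the model's actual merging code; the function name and the use of NumPy arrays in place of model tensors are assumptions for illustration only:

```python
import numpy as np

def dare_merge(base, finetuned, drop_rate=0.85, seed=0):
    """Illustrative DARE sketch (not the model's actual code):
    randomly zero out a fraction `drop_rate` of the delta
    (finetuned - base) parameters, then rescale the survivors by
    1 / (1 - drop_rate) so the expected delta is preserved."""
    rng = np.random.default_rng(seed)
    delta = finetuned - base
    # Keep roughly (1 - drop_rate) of the delta entries at random.
    keep_mask = rng.random(delta.shape) >= drop_rate
    # Rescale survivors so E[rescaled_delta] == delta.
    rescaled_delta = delta * keep_mask / (1.0 - drop_rate)
    return base + rescaled_delta

base = np.zeros(10_000)
finetuned = np.ones(10_000)
merged = dare_merge(base, finetuned, drop_rate=0.85)
```

With a drop rate of 0.85, only about 15% of the delta entries survive, but each is scaled up by 1/0.15, so the merged weights match the fine-tuned ones in expectation.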