Kukedlc/MyModelsMerge-7b
Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 4k · Published: Mar 24, 2024 · License: apache-2.0 · Architecture: Transformer · Open Weights
Kukedlc/MyModelsMerge-7b is a 7-billion-parameter language model created by Kukedlc by merging eight distinct models with the LazyMergekit tool, using the DARE TIES merge method. The merge incorporates models such as liminerity/M7-7b, Kukedlc/Neural4gsm8k, and several other Kukedlc-developed models, each assigned its own weight and density in the merge configuration, with the aim of combining their strengths. The resulting model is intended for general language generation tasks, leveraging the combined capabilities of its constituent models.
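LazyMergekit generates a mergekit YAML configuration of roughly the following shape. This is an illustrative sketch, not the model's actual config: only two of the eight constituent models are named above, and the specific weights, densities, base model, and dtype shown here are assumptions.

```yaml
# Hypothetical mergekit config for a DARE TIES merge (values are illustrative).
models:
  - model: liminerity/M7-7b        # named in the model card
    parameters:
      weight: 0.2                  # assumed contribution weight
      density: 0.5                 # assumed fraction of delta weights kept
  - model: Kukedlc/Neural4gsm8k    # named in the model card
    parameters:
      weight: 0.2
      density: 0.5
  # ...the remaining six constituent models would be listed here similarly
merge_method: dare_ties
base_model: liminerity/M7-7b       # assumed base; the card does not specify one
dtype: bfloat16                    # assumed
```

In DARE TIES, each model's `density` controls how sparsely its parameter deltas (relative to the base model) are sampled, and `weight` scales its contribution when the surviving deltas are combined back into the base.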