vishnukv/newmerge
Text generation · Concurrency cost: 1 · Model size: 7B · Quant: FP8 · Context length: 4k · Published: Mar 5, 2024 · License: MIT · Architecture: Transformer · Open weights

vishnukv/newmerge is a 7-billion-parameter language model created by vishnukv by merging pre-trained models with the DARE TIES method. It uses uukuguy/speechless-mistral-dolphin-orca-platypus-samantha-7b as its base and incorporates PetroGPT/WestSeverus-7B-DPO and vishnukv/WestSeverusJaskier. The merge is intended to combine the strengths of its constituent models for general language generation tasks.
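DARE TIES merges are commonly produced with the mergekit tool, which takes a YAML config naming the base model, the models to merge, and per-model `density`/`weight` parameters. The actual config for this merge is not published; the sketch below is illustrative only, and the density and weight values are assumptions, not the values used for vishnukv/newmerge.

```yaml
# Hypothetical mergekit config for a DARE TIES merge of the models named
# in the card above. density/weight values are placeholders, not the
# actual recipe used for vishnukv/newmerge.
merge_method: dare_ties
base_model: uukuguy/speechless-mistral-dolphin-orca-platypus-samantha-7b
models:
  - model: uukuguy/speechless-mistral-dolphin-orca-platypus-samantha-7b
    # base model: contributes the backbone; no merge parameters needed
  - model: PetroGPT/WestSeverus-7B-DPO
    parameters:
      density: 0.5   # fraction of delta weights kept after random pruning
      weight: 0.5    # scaling of this model's contribution
  - model: vishnukv/WestSeverusJaskier
    parameters:
      density: 0.5
      weight: 0.5
dtype: bfloat16
```

With mergekit installed, such a config is typically applied with `mergekit-yaml config.yml ./output-model`.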
