maxcurrent/Wiz2Beagle-7b-v1
Text generation · Concurrency cost: 1 · Model size: 7B · Quantization: FP8 · Context length: 4K · License: apache-2.0 · Architecture: Transformer

Wiz2Beagle-7b-v1 is a 7-billion-parameter language model published by maxcurrent, created by merging amazingvince/Not-WizardLM-2-7B and mlabonne/NeuralBeagle14-7B with the VortexMerge kit. The merge uses the TIES method to combine the strengths of the two constituent models, drawing on the diverse training of each. It is intended for general language tasks and supports a context length of 4096 tokens.
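A TIES merge of this kind is typically declared in a short YAML config. The exact recipe for this model is not published, so the sketch below is illustrative: the base model (assumed here to be mistralai/Mistral-7B-v0.1, a common ancestor of both source models) and the density/weight values are assumptions, not the actual settings.

```yaml
# Hypothetical mergekit-style config for a TIES merge of the two sources.
# base_model and all density/weight values are assumptions.
models:
  - model: amazingvince/Not-WizardLM-2-7B
    parameters:
      density: 0.5   # fraction of delta parameters kept after trimming (assumed)
      weight: 0.5    # relative contribution to the merged weights (assumed)
  - model: mlabonne/NeuralBeagle14-7B
    parameters:
      density: 0.5
      weight: 0.5
merge_method: ties
base_model: mistralai/Mistral-7B-v0.1  # assumed shared base model
parameters:
  normalize: true    # rescale summed weights back to unit total
dtype: bfloat16
```

In a TIES merge, each source model's deltas from the base are trimmed to the densest fraction, conflicting parameter signs are resolved by election, and the surviving deltas are averaged back onto the base model.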
