flemmingmiguel/DareBeagle-7B
Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 4K · Published: Jan 16, 2024 · License: apache-2.0 · Architecture: Transformer · Open Weights
flemmingmiguel/DareBeagle-7B is a 7 billion parameter language model created by flemmingmiguel as a SLERP merge of mlabonne/NeuralBeagle14-7B and mlabonne/NeuralDaredevil-7B. This experimental merge combines two DPO-tuned models with distinct strengths to test whether their capabilities are preserved or improved by merging, and it is intended as a base for further fine-tuning experiments aimed at identifying optimal merges for various tasks.
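A minimal usage sketch with the Hugging Face transformers library is shown below. The generation settings are illustrative assumptions, not recommendations from the model author.

```python
# Illustrative only: loads the merged model and generates text with default sampling.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "flemmingmiguel/DareBeagle-7B"

tokenizer = AutoTokenizer.from_pretrained(model_id)
# device_map="auto" requires the accelerate package; drop it to load on CPU.
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype="auto", device_map="auto"
)

prompt = "Explain spherical linear interpolation (SLERP) in one sentence."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```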