monology/mixtral-ties
Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 4k · Published: Mar 27, 2024 · License: apache-2.0 · Architecture: Transformer · Open Weights

monology/mixtral-ties is an experimental 7-billion-parameter language model created by monology and based on the Mistral-7B-v0.1 architecture. It merges eight Mixtral-slerp models with the TIES merge method, aiming to combine their individual capabilities, and is intended as an experiment in the effects of model merging.
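For readers unfamiliar with TIES-Merging (Yadav et al., 2023), the method builds a merged model from per-model "task vectors" (fine-tuned weights minus base weights) in three steps: trim each task vector to its largest-magnitude entries, elect a per-parameter sign by majority mass, then average only the entries that agree with the elected sign. The sketch below is purely illustrative of that procedure, not monology's actual merge script; the function name `ties_merge` and the `density` and `lam` parameters are hypothetical, and real merges of this kind are typically run with a tool such as mergekit.

```python
import torch

def ties_merge(base, task_models, density=0.2, lam=1.0):
    """Illustrative TIES merge over state dicts (hypothetical helper).

    base, task_models: dicts mapping parameter names to tensors.
    density: fraction of task-vector entries kept after trimming.
    lam: scaling applied to the merged task vector.
    """
    merged = {}
    for name, base_w in base.items():
        # Task vectors: difference between each fine-tune and the base.
        deltas = torch.stack([m[name] - base_w for m in task_models])
        flat = deltas.reshape(len(task_models), -1)

        # 1. Trim: keep only the top-`density` entries by magnitude per model.
        k = max(1, int(density * flat.shape[1]))
        thresh = flat.abs().kthvalue(flat.shape[1] - k + 1, dim=1).values
        trimmed = torch.where(flat.abs() >= thresh.unsqueeze(1),
                              flat, torch.zeros_like(flat))

        # 2. Elect sign: majority sign, weighted by total magnitude per entry.
        elected = torch.sign(trimmed.sum(dim=0))

        # 3. Disjoint mean: average only entries agreeing with the elected sign.
        agree = (torch.sign(trimmed) == elected.unsqueeze(0)) & (trimmed != 0)
        counts = agree.sum(dim=0).clamp(min=1)
        mean_delta = (trimmed * agree).sum(dim=0) / counts

        merged[name] = base_w + lam * mean_delta.reshape(base_w.shape)
    return merged
```

The trim-and-elect steps are what distinguish TIES from plain weight averaging: they discard low-magnitude noise and resolve sign conflicts between the eight source models before averaging, which is why the method is a natural fit for combining many related fine-tunes.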
