Eric111/Mayo
Text generation · Open weights
- Model size: 7B
- Quantization: FP8
- Context length: 4k
- Concurrency cost: 1
- Published: Feb 4, 2024
- License: apache-2.0
- Architecture: Transformer
Eric111/Mayo is a 7-billion-parameter language model created by Eric111 by merging mlabonne/NeuralBeagle14-7B and openchat/openchat-3.5-0106 with the SLERP (spherical linear interpolation) merge method, which blends the two models' weights along an arc on a hypersphere rather than averaging them linearly. The merge is intended to combine the strengths of its constituent models into a balanced performance profile, making it suitable for general-purpose text generation within its 4096-token context window.
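To illustrate what the SLERP merge method does, here is a minimal, self-contained sketch of spherical linear interpolation between two weight vectors. This is an illustrative implementation of the general slerp formula, not the actual code used to produce this model (merges like this are typically done with tooling such as mergekit, whose exact parameters for this model are not published here):

```python
import math

def slerp(t, v0, v1, eps=1e-8):
    """Spherically interpolate between vectors v0 and v1 at fraction t in [0, 1]."""
    dot = sum(a * b for a, b in zip(v0, v1))
    norm0 = math.sqrt(sum(a * a for a in v0))
    norm1 = math.sqrt(sum(b * b for b in v1))
    # Clamp to avoid domain errors from floating-point drift.
    cos_omega = max(-1.0, min(1.0, dot / (norm0 * norm1)))
    omega = math.acos(cos_omega)  # angle between the two vectors
    if abs(math.sin(omega)) < eps:
        # Nearly colinear vectors: fall back to linear interpolation.
        return [(1 - t) * a + t * b for a, b in zip(v0, v1)]
    s0 = math.sin((1 - t) * omega) / math.sin(omega)
    s1 = math.sin(t * omega) / math.sin(omega)
    return [s0 * a + s1 * b for a, b in zip(v0, v1)]

# At t=0.5, two orthogonal unit vectors blend to a point on the unit arc
# midway between them, preserving magnitude -- unlike plain averaging,
# which would shrink the result.
mid = slerp(0.5, [1.0, 0.0], [0.0, 1.0])
```

In a real model merge this interpolation is applied tensor-by-tensor across the two checkpoints, often with a different `t` per layer or parameter group.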