Manolo26/metis-chat-7b
Text Generation · Model Size: 7B · Quant: FP8 · Context Length: 4k · Concurrency Cost: 1 · Published: Jan 24, 2024 · License: apache-2.0 · Architecture: Transformer · Open Weights

Manolo26/metis-chat-7b is a 7-billion-parameter language model created by Manolo26. It was produced by merging mlabonne/NeuralBeagle14-7B and mlabonne/NeuralHermes-2.5-Mistral-7B using spherical linear interpolation (slerp). The merge combines the strengths of both base models, yielding a versatile chat-optimized model with a 4096-token context length, suited to general conversational AI and text-generation tasks.
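To illustrate the merge method, here is a minimal sketch of spherical linear interpolation applied to a pair of weight vectors. This is not the exact merge recipe used for this model (tools like mergekit apply slerp per-tensor with configurable interpolation factors); it only shows the underlying formula.

```python
import numpy as np

def slerp(t, v0, v1, eps=1e-8):
    """Spherically interpolate between weight vectors v0 and v1 at fraction t."""
    v0n = v0 / (np.linalg.norm(v0) + eps)
    v1n = v1 / (np.linalg.norm(v1) + eps)
    dot = np.clip(np.dot(v0n, v1n), -1.0, 1.0)
    theta = np.arccos(dot)            # angle between the two directions
    if theta < eps:                   # nearly parallel: fall back to linear interpolation
        return (1 - t) * v0 + t * v1
    s = np.sin(theta)
    return (np.sin((1 - t) * theta) / s) * v0 + (np.sin(t * theta) / s) * v1

# Toy example: blend two orthogonal unit "weight tensors" halfway.
a = np.array([1.0, 0.0])
b = np.array([0.0, 1.0])
merged = slerp(0.5, a, b)  # stays on the unit sphere, unlike plain averaging
```

Unlike a plain weighted average, slerp follows the arc between the two weight directions, which preserves the norm of the interpolated weights; this is often cited as the reason slerp merges degrade model quality less than linear merges.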
