Manolo26/metis-chat-instruct-7b
Text generation · Concurrency cost: 1 · Model size: 7B · Quant: FP8 · Context length: 4k · Published: Jan 31, 2024 · License: apache-2.0 · Architecture: Transformer · Open weights

Manolo26/metis-chat-instruct-7b is a 7-billion-parameter instruction-tuned language model created by Manolo26, formed by merging mlabonne/NeuralBeagle14-7B and mlabonne/NeuralMarcoro14-7B with the SLERP (spherical linear interpolation) merge method. The model is designed for chat-based interaction and general instruction following, combining the strengths of its two constituent models. Its 4096-token context length makes it suitable for conversational AI applications.
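The exact merge recipe is not reproduced here, but the core SLERP operation such merges apply is interpolation along the great-circle arc between two weight tensors, rather than a straight line between them. The sketch below (hypothetical helper, not the actual merge tooling) shows that operation on flattened tensors:

```python
import numpy as np

def slerp(t, v0, v1, dot_threshold=0.9995, eps=1e-8):
    """Spherical linear interpolation between two flat weight tensors.

    t=0 returns v0, t=1 returns v1; intermediate t values follow the
    arc between the two directions instead of the straight chord.
    """
    # Normalize copies to measure the angle between the two directions
    v0_n = v0 / (np.linalg.norm(v0) + eps)
    v1_n = v1 / (np.linalg.norm(v1) + eps)
    dot = np.clip(np.dot(v0_n, v1_n), -1.0, 1.0)

    # Nearly colinear tensors: fall back to plain linear interpolation,
    # since sin(theta) would be numerically unstable near zero
    if abs(dot) > dot_threshold:
        return (1 - t) * v0 + t * v1

    theta = np.arccos(dot)             # angle between the two directions
    sin_theta = np.sin(theta)
    s0 = np.sin((1 - t) * theta) / sin_theta
    s1 = np.sin(t * theta) / sin_theta
    return s0 * v0 + s1 * v1

# Toy example on orthogonal 2-D "tensors"
a = np.array([1.0, 0.0])
b = np.array([0.0, 1.0])
mid = slerp(0.5, a, b)  # midpoint lies on the unit circle, not at [0.5, 0.5]
```

In a full model merge this interpolation is applied tensor by tensor across both checkpoints, often with a different `t` per layer group; the specific schedule used for this model is not documented in the card.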
