kidyu/Moza-7B-v1.0
Text Generation · Open Weights
Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 4k
Published: Feb 10, 2024 · License: apache-2.0 · Architecture: Transformer

Moza-7B-v1.0 by kidyu is a 7-billion-parameter language model based on the Mistral-7B-v0.1 architecture, created with a DARE TIES merge of nine distinct pre-trained models. The merge weights models such as NeuralHermes, OpenOrca, and neural-chat most heavily, with the aim of producing a versatile general-purpose model. It uses the Alpaca prompt format and achieves an average score of 69.66 on the Open LLM Leaderboard, making it suitable for general conversational and reasoning tasks. A minimal usage sketch follows.
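The sketch below shows one way to load the model with the Hugging Face transformers library and query it using the standard Alpaca instruction template. The exact template wording, dtype, and generation settings are assumptions for illustration, not values taken from the model card.

```python
# A minimal sketch, assuming the standard Alpaca instruction-only template
# and default Hugging Face transformers APIs; generation settings are illustrative.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "kidyu/Moza-7B-v1.0"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # half precision; a 7B model fits on a single 24 GB GPU
    device_map="auto",
)

# Standard Alpaca prompt format (instruction-only variant, assumed wording).
prompt = (
    "Below is an instruction that describes a task. "
    "Write a response that appropriately completes the request.\n\n"
    "### Instruction:\n{instruction}\n\n"
    "### Response:\n"
).format(instruction="Explain what a model merge is in one paragraph.")

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=256, do_sample=True, temperature=0.7)

# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(output[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```

Note that the hosted endpoint serves an FP8 quantization; the float16 load above applies only when running the open weights locally.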
