mrfakename/NeuralOrca-7B-v1
Text generation · Model size: 7B · Quant: FP8 · Context length: 4k · Concurrency cost: 1 · Published: Dec 1, 2023 · License: apache-2.0 · Architecture: Transformer

NeuralOrca-7B-v1 by mrfakename is an experimental 7-billion-parameter "Frankenmerge" model combining mlabonne/NeuralHermes-2.5-Mistral-7B and Open-Orca/Mistral-7B-OpenOrca. The model is instruction-tuned, uses the ChatML prompt format, and offers an extended context length of 8192 tokens. It is designed for general conversational AI tasks, with initial evaluations showing an average score of 67.64 on the Open LLM Leaderboard.
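Since the model expects the ChatML prompt format, prompts must wrap each conversation turn in `<|im_start|>`/`<|im_end|>` markers. A minimal sketch of building such a prompt (the helper name `chatml_prompt` is illustrative, not part of the model's tooling):

```python
def chatml_prompt(system: str, user: str) -> str:
    # ChatML wraps each turn as <|im_start|>ROLE ... <|im_end|>;
    # the trailing "<|im_start|>assistant\n" cues the model to reply.
    return (
        f"<|im_start|>system\n{system}<|im_end|>\n"
        f"<|im_start|>user\n{user}<|im_end|>\n"
        "<|im_start|>assistant\n"
    )

prompt = chatml_prompt(
    "You are a helpful assistant.",
    "Summarize the Orca approach in one sentence.",
)
print(prompt)
```

The resulting string can be passed directly to any completion endpoint serving the model; chat-style APIs that apply the tokenizer's chat template perform this wrapping automatically.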
