Undi95/MistralMegaOrca-7B
Text generation · Model size: 7B · Quantization: FP8 · Context length: 4k · License: apache-2.0 · Architecture: Transformer · Open weights

Undi95/MistralMegaOrca-7B is a 7-billion-parameter language model created by Undi95 and based on the Mistral-7B-v0.1 architecture. It is a merge of several fine-tuned Mistral variants, including Open-Orca/Mistral-7B-OpenOrca and jondurbin/airoboros-m-7b-3.0, combined with the TIES merge method. The model targets general conversational AI tasks, aiming to combine the strengths of its constituent fine-tunes in a single checkpoint.
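To illustrate what a TIES-style merge does, here is a minimal NumPy sketch of the core idea: compute each fine-tune's delta from the shared base, trim each delta to its largest-magnitude entries, elect a per-parameter sign, and average only the deltas that agree with it. This is a toy illustration of the general technique, not the actual recipe or implementation used to build this model; the `density` parameter and all function names here are assumptions for the example.

```python
import numpy as np

def ties_merge(base, finetuned, density=0.5):
    """Toy TIES-style merge of several fine-tuned weight tensors onto a base."""
    # Task vectors: each fine-tune's delta from the shared base weights.
    deltas = [ft - base for ft in finetuned]

    trimmed = []
    for d in deltas:
        # Trim: keep only the top-`density` fraction of entries by magnitude.
        k = int(np.ceil(density * d.size))
        thresh = np.sort(np.abs(d).ravel())[-k]
        trimmed.append(np.where(np.abs(d) >= thresh, d, 0.0))
    stacked = np.stack(trimmed)

    # Elect sign: per parameter, the sign with the larger total magnitude wins.
    sign = np.sign(np.sum(stacked, axis=0))
    sign[sign == 0] = 1.0

    # Merge: average only the trimmed deltas that agree with the elected sign.
    agree = (np.sign(stacked) == sign) & (stacked != 0)
    counts = np.maximum(agree.sum(axis=0), 1)
    merged = np.where(agree, stacked, 0.0).sum(axis=0) / counts
    return base + merged
```

In a real merge this runs tensor-by-tensor over the full model state dict; trimming and sign election are what let conflicting fine-tunes coexist without their deltas cancelling each other out.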
