NickyNicky/Mistral-7B-OpenOrca-oasst_top1_2023-08-25-v1
Text generation | Model size: 7B | Quant: FP8 | Context length: 4k | License: apache-2.0 | Architecture: Transformer | Concurrency cost: 1 | Open weights

NickyNicky/Mistral-7B-OpenOrca-oasst_top1_2023-08-25-v1 is a 7-billion-parameter language model fine-tuned from Open-Orca/Mistral-7B-OpenOrca. It is further instruction-tuned on the OpenAssistant/oasst_top1_2023-08-25 dataset, which contains multilingual conversational data. The model targets general-purpose conversational AI and instruction following, and supports a 4096-token context length.
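Since the base model Open-Orca/Mistral-7B-OpenOrca uses the ChatML prompt format, a reasonable assumption is that this fine-tune expects the same layout. The sketch below shows how such a prompt could be assembled by hand; `format_chatml` is a hypothetical helper, and the template should be verified against the model's own chat template (e.g. via `tokenizer.apply_chat_template`) before use.

```python
def format_chatml(messages):
    """Render a list of {role, content} dicts as a ChatML prompt string.

    Assumes the ChatML format used by the base OpenOrca model:
    <|im_start|>role\ncontent<|im_end|>\n ... then an open assistant turn.
    """
    parts = []
    for m in messages:
        parts.append(f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>\n")
    # Leave an open assistant turn to cue the model's reply.
    parts.append("<|im_start|>assistant\n")
    return "".join(parts)

prompt = format_chatml([
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Hola, ¿cómo estás?"},  # multilingual input
])
print(prompt)
```

The resulting string can be passed directly to a completion endpoint or tokenizer; within the 4096-token context, the conversation history simply accumulates as additional ChatML turns.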
