NickyNicky/Mistral-7B-OpenOrca-oasst_top1_2023-08-25-v3
Text generation · Concurrency cost: 1 · Model size: 7B · Quant: FP8 · Ctx length: 4K · Published: Oct 13, 2023 · License: apache-2.0 · Architecture: Transformer

NickyNicky/Mistral-7B-OpenOrca-oasst_top1_2023-08-25-v3 is a 7-billion-parameter causal language model fine-tuned from Open-Orca/Mistral-7B-OpenOrca. It was instruction-tuned on the OpenAssistant/oasst_top1_2023-08-25 dataset and incorporates an attention-sink mechanism, which may improve long-context handling. The model is intended for general-purpose conversational AI and instruction-following tasks, with a context length of 4096 tokens.
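Since the model derives from Open-Orca/Mistral-7B-OpenOrca, prompts are most likely expected in ChatML format; this is an assumption based on the base model's convention, not something the card above states. A minimal sketch of building such a prompt:

```python
def build_chatml_prompt(system: str, user: str) -> str:
    """Assemble a ChatML-style prompt (assumed format, inherited from the
    Open-Orca/Mistral-7B-OpenOrca base model). The trailing assistant header
    leaves the model to complete the assistant turn."""
    return (
        f"<|im_start|>system\n{system}<|im_end|>\n"
        f"<|im_start|>user\n{user}<|im_end|>\n"
        f"<|im_start|>assistant\n"
    )

# Example: a generic instruction-following prompt.
prompt = build_chatml_prompt(
    "You are a helpful assistant.",
    "Summarize what an attention sink does in one sentence.",
)
print(prompt)
```

The resulting string can then be tokenized and passed to the model, e.g. via `transformers` with `AutoTokenizer.from_pretrained` and `AutoModelForCausalLM.from_pretrained` on the model ID above, keeping total input plus generated tokens within the 4096-token context window.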
