NickyNicky/Mistral-7B-OpenOrca-oasst_top1_2023-08-25-v2
Text generation · Model size: 7B · Quantization: FP8 · Context length: 4K · Published: Oct 11, 2023 · License: apache-2.0 · Architecture: Transformer · Open weights

NickyNicky/Mistral-7B-OpenOrca-oasst_top1_2023-08-25-v2 is a 7 billion parameter Mistral-based causal language model developed by NickyNicky. This model is fine-tuned on the OpenAssistant/oasst_top1_2023-08-25 dataset and incorporates attention sink technology for improved generation efficiency. It is designed for general-purpose conversational AI and instruction following, supporting multilingual interactions across 20 languages.
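A minimal usage sketch with the Hugging Face `transformers` library is shown below. The ChatML-style prompt template is an assumption based on the model's OpenOrca lineage; verify it against the tokenizer's chat template before relying on it. The heavy `transformers` import is deferred into the generation function so the prompt helper can be used on its own.

```python
MODEL_ID = "NickyNicky/Mistral-7B-OpenOrca-oasst_top1_2023-08-25-v2"


def build_chatml_prompt(user_message: str) -> str:
    # ChatML-style template (assumption based on the OpenOrca lineage;
    # check the model's tokenizer_config for the authoritative template).
    return (
        f"<|im_start|>user\n{user_message}<|im_end|>\n"
        "<|im_start|>assistant\n"
    )


def generate(user_message: str, max_new_tokens: int = 256) -> str:
    # Imported lazily so the prompt helper works without transformers installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto")
    inputs = tokenizer(build_chatml_prompt(user_message), return_tensors="pt")
    inputs = inputs.to(model.device)
    out = model.generate(**inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, skipping the prompt.
    new_tokens = out[0][inputs["input_ids"].shape[1]:]
    return tokenizer.decode(new_tokens, skip_special_tokens=True)
```

Running `generate("Hello!")` downloads the 7B checkpoint on first use, so a GPU with sufficient memory (or CPU offloading) is assumed.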
