Abe13/Full-juni-Mistral-7B-OpenOrca
Task: Text generation
Concurrency cost: 1
Model size: 7B
Quantization: FP8
Context length: 4k
Published: Oct 24, 2023
License: apache-2.0
Architecture: Transformer
Availability: Open weights (cold)
Abe13/Full-juni-Mistral-7B-OpenOrca is a 7-billion-parameter language model based on Mistral 7B, fine-tuned to integrate new knowledge while preserving its existing capabilities. The model focuses on improving understanding and performance through knowledge-base updates, making it suited to applications that need up-to-date information without sacrificing established functionality. It offers a 4,096-token context length.
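As a minimal sketch of how a prompt for this model might be assembled: other Mistral-7B-OpenOrca fine-tunes commonly use the ChatML prompt convention, so it is a reasonable assumption (not confirmed by this card) that the same format applies here. The helper below only builds the prompt string; actually running the model would require loading the weights separately.

```python
def build_chatml_prompt(system: str, user: str) -> str:
    """Assemble a ChatML-style prompt string.

    Assumption: this OpenOrca fine-tune follows the ChatML format
    (<|im_start|>role ... <|im_end|>) used by related
    Mistral-7B-OpenOrca models; check the tokenizer config to confirm.
    """
    return (
        f"<|im_start|>system\n{system}<|im_end|>\n"
        f"<|im_start|>user\n{user}<|im_end|>\n"
        f"<|im_start|>assistant\n"  # model continues from here
    )


prompt = build_chatml_prompt(
    "You are a helpful assistant.",
    "Summarize the Mistral architecture in one sentence.",
)
print(prompt)
```

Keep the combined prompt plus generated output within the 4,096-token context window; longer conversations need truncation or summarization of earlier turns.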