Siliconic/raven-x-1.1
Text Generation · Concurrency Cost: 1 · Model Size: 13B · Quant: FP8 · Ctx Length: 4k · Architecture: Transformer · Cold
Siliconic/raven-x-1.1 is a 13-billion-parameter language model developed by Siliconic Technologies, built on a modified Vicuna-13B-delta on top of the LLaMA architecture. The model received a delta-weight upgrade and 32-bit quantization, and was trained on a dataset drawn from Oasst, ChatGPT, ShareGPT, and Wikipedia. It is a custom model for Raven AI, aimed at general conversational and knowledge-based tasks.
Raven-X-1.1 Model Overview
Raven-X-1.1 is a 13-billion-parameter language model developed by Siliconic Technologies specifically for the Raven AI system. It upgrades raven-x-001 with a delta-weight update and 32-bit quantization for improved performance.
Key Capabilities
- Architecture: Built upon a modified version of the Vicuna-13B-delta and LLaMA models.
- Training Data: Fine-tuned on a comprehensive dataset including Oasst, ChatGPT, ShareGPT, and Wikipedia, enhancing its general knowledge and conversational abilities.
- Customization: Designed as a custom solution for Raven AI, indicating specialized optimization for its intended applications.
- Development: Created and fine-tuned by Akshit Kumar.
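Because the model derives from Vicuna-13B, it likely expects Vicuna's conversation prompt template. The sketch below assumes the Vicuna v1.1-style `USER:`/`ASSISTANT:` format; the system message and separators here are illustrative, so check the model card for the exact template before relying on it.

```python
# Hypothetical prompt builder for a Vicuna-derived chat model.
# Assumption: raven-x-1.1 inherits Vicuna v1.1's USER:/ASSISTANT: template;
# verify the actual template against the published model card.

SYSTEM = (
    "A chat between a curious user and an artificial intelligence assistant. "
    "The assistant gives helpful, detailed, and polite answers to the user's questions."
)

def build_prompt(turns: list[tuple[str, str]], user_message: str) -> str:
    """Render prior (user, assistant) turns plus a new user message."""
    parts = [SYSTEM]
    for user, assistant in turns:
        # Completed turns end with the end-of-sequence marker.
        parts.append(f"USER: {user} ASSISTANT: {assistant}</s>")
    # The open turn ends with "ASSISTANT:" so the model continues from there.
    parts.append(f"USER: {user_message} ASSISTANT:")
    return "\n".join(parts)

prompt = build_prompt([], "What is Raven AI?")
```

The trailing `ASSISTANT:` is the cue for the model to generate its reply; a mismatched template is a common cause of degraded output with Vicuna-family checkpoints.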
Good For
- General Conversational AI: Its training on diverse chat and knowledge datasets makes it suitable for engaging in broad conversational tasks.
- Knowledge-based Applications: The inclusion of Wikipedia in its training data suggests proficiency in answering factual queries and providing information.
- Integration into the Raven AI System: Optimized for the Raven AI ecosystem, so applications built on that platform should see the most consistent performance.
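If the checkpoint is published on the Hugging Face Hub under the repo id `Siliconic/raven-x-1.1` (an assumption based on the page title) and ships standard LLaMA-style weights, it could be loaded with the `transformers` library along these lines. The dtype and generation settings are illustrative, not taken from the model card.

```python
# Minimal loading sketch for a Hub-hosted 13B causal LM with transformers.
# Assumptions: the repo id below exists on the Hugging Face Hub and contains
# LLaMA-compatible weights; adjust dtype/device placement to your hardware.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "Siliconic/raven-x-1.1"  # assumed repo id

def load():
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID,
        torch_dtype=torch.float16,  # a 13B model in fp16 needs roughly 26 GB
        device_map="auto",          # shard across available GPUs if needed
    )
    return tokenizer, model

if __name__ == "__main__":
    tokenizer, model = load()
    inputs = tokenizer("USER: What is Raven AI? ASSISTANT:", return_tensors="pt").to(model.device)
    out = model.generate(**inputs, max_new_tokens=128, do_sample=True, temperature=0.7)
    print(tokenizer.decode(out[0], skip_special_tokens=True))
```

The download and generation are guarded behind `__main__` so the module can be imported without pulling the full 13B checkpoint.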