JosephusCheung/Guanaco
Guanaco: Multilingual Instruction-Following LLaMA Model
Guanaco is a 7 billion parameter instruction-following language model based on Meta's LLaMA architecture, developed by JosephusCheung. It significantly enhances the original Alpaca dataset by incorporating over 534,000 new entries, expanding its linguistic coverage to include English, Simplified Chinese, Traditional Chinese (Taiwan and Hong Kong), Japanese, and German. This extensive multilingual dataset enables Guanaco to perform exceptionally well in diverse language environments.
Key Capabilities & Features
- Multilingual Proficiency: Trained on over 534K instruction entries spanning English, Simplified and Traditional Chinese, Japanese, and German, making it highly effective for global applications.
- Improved Context Handling: Uses a structured prompt format similar to ChatGPT's while remaining compatible with Alpaca-style prompts, enabling coherent multi-turn dialogue.
- Advanced Role-Playing: Supports immersive role-playing in English, Chinese, Japanese, and German, enabling the model to assume specific roles, historical figures, or fictional characters with consistent persona maintenance.
- Refined Response Rejection: Incorporates reserved keywords (NO IDEA, FORBIDDEN, SFW) to clearly communicate when it lacks knowledge, refuses to answer due to ethical concerns, or filters NSFW content.
- Continued Conversations: Designed to maintain context and continue discussions on ongoing topics, providing more coherent and adaptable responses.
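The structured, multi-turn prompt format described above can be sketched as follows. The exact Instruction/Input/Response layout and the System/User/Assistant role markers are assumptions modeled on Alpaca-style templates, not the authoritative format; consult the model card before relying on them.

```python
def build_guanaco_prompt(history, system=None, user_input=""):
    """Assemble an Alpaca-style multi-turn prompt.

    The section headers and role markers below are illustrative
    assumptions -- verify against the Guanaco model card before use.
    """
    lines = ["### Instruction:"]
    # Prior turns give the model conversational context.
    for user_turn, assistant_turn in history:
        lines.append(f"User: {user_turn}")
        lines.append(f"Assistant: {assistant_turn}")
    lines.append("### Input:")
    if system:
        # A good place for verifiable sources or grounding text.
        lines.append(f"System: {system}")
    lines.append(f"User: {user_input}")
    lines.append("### Response:")
    return "\n".join(lines)

prompt = build_guanaco_prompt(
    history=[("Hello!", "Hi, how can I help?")],
    system="Answer using the provided reference text only.",
    user_input="Summarize the reference in German.",
)
print(prompt)
```

Keeping the conversation history inside the prompt, rather than resending isolated questions, is what lets the model continue discussions on ongoing topics.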
Use Cases & Considerations
Guanaco is well suited to applications that need robust multilingual instruction following, engaging multi-turn conversation, and dynamic role-playing. As a 7B-parameter model, however, its knowledge-based output can be inaccurate: provide verifiable sources in the system prompt when factual accuracy matters, and inform users of this limitation to prevent the spread of misinformation.
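A downstream application can watch for the reserved rejection keywords listed above and translate them into user-facing behavior. The mapping and the substring-scanning strategy below are a hypothetical sketch: the model card only specifies the keywords themselves, not how clients should detect or surface them.

```python
# Reserved keywords the model uses to signal response rejection,
# paired with hypothetical user-facing explanations.
RESERVED_KEYWORDS = {
    "NO IDEA": "The model lacks the knowledge to answer this.",
    "FORBIDDEN": "The model declined to answer on ethical grounds.",
    "SFW": "The response was withheld by the NSFW-content filter.",
}

def classify_response(text):
    """Return (is_rejection, user_message) for a raw model response.

    Plain substring matching is an assumption; a production client
    may need stricter rules (e.g. "SFW" also occurs inside "NSFW").
    """
    for keyword, message in RESERVED_KEYWORDS.items():
        if keyword in text:
            return True, message
    return False, text
```

For example, `classify_response("NO IDEA")` flags the response as a rejection, while an ordinary answer passes through unchanged.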