JosephusCheung/Guanaco
Text generation · Model size: 7B · Quantization: FP8 · Context length: 4k · Published: Apr 8, 2023 · License: GPL-3.0 · Architecture: Transformer · Open weights

Guanaco is a 7-billion-parameter instruction-following language model developed by JosephusCheung, built on Meta's LLaMA architecture. Its training data expands the Alpaca instruction dataset with over 534K additional entries spanning multiple languages, including English, Chinese, Japanese, and German. The model is designed for multilingual use and adds features such as structured context handling, role-playing support, and refined response-rejection behavior. It is particularly suited to multi-turn dialogue and immersive conversational use across diverse linguistic backgrounds.
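Multi-turn dialogue with a causal language model requires serializing the conversation history into a single prompt before each generation step. The helper below is a minimal sketch of that idea; the `format_dialogue` name and the Alpaca-style `### Instruction:` / `### Response:` delimiters are illustrative assumptions, not Guanaco's documented prompt format.

```python
def format_dialogue(system: str, turns: list) -> str:
    """Serialize a multi-turn dialogue into one prompt string.

    `turns` is a list of (user, assistant) pairs; pass None as the
    assistant text of the final pair to ask the model to complete it.
    The delimiters are an assumed Alpaca-style template.
    """
    parts = [system.strip()]
    for user, assistant in turns:
        parts.append(f"### Instruction:\n{user.strip()}")
        if assistant is not None:
            parts.append(f"### Response:\n{assistant.strip()}")
        else:
            # Leave the final response empty for the model to fill in.
            parts.append("### Response:\n")
    return "\n\n".join(parts)


prompt = format_dialogue(
    "You are a helpful assistant.",
    [("Hello", "Hi! How can I help?"), ("Summarize LLaMA in one line.", None)],
)
```

The resulting string would then be passed to the tokenizer and model; earlier turns stay in the prompt so the model can condition on the full conversation, subject to the 4k context limit.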
