ganchengguang/Yoko-7B-Japanese-v1
Text generation · Concurrency cost: 1 · Model size: 7B · Quantization: FP8 · Context length: 4K · License: MIT · Architecture: Transformer · Open weights

ganchengguang/Yoko-7B-Japanese-v1 is a 7-billion-parameter language model fine-tuned from LLaMA2-7B, developed with contributions from the Mori Lab at Yokohama National University. The model is trained on the Guanaco dataset and optimized for improved performance in Chinese and Japanese. It is designed for both chat and non-chat applications, offering enhanced linguistic capabilities for East Asian languages within a 4096-token context window.
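As a LLaMA2 derivative, the model can be loaded through the standard Hugging Face transformers text-generation workflow. The snippet below is a minimal sketch, assuming the repository ships LLaMA2-compatible weights and tokenizer files; the dtype, sampling settings, and example prompt are illustrative assumptions, not taken from the model card.

```python
# Minimal sketch: loading Yoko-7B-Japanese-v1 via the standard transformers API.
# Assumes LLaMA2-compatible weights/tokenizer in the repo; settings are illustrative.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "ganchengguang/Yoko-7B-Japanese-v1"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # assumption: half precision to fit a 7B model on one GPU
    device_map="auto",
)

prompt = "日本の首都はどこですか？"  # "What is the capital of Japan?"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(
    **inputs,
    max_new_tokens=128,  # keep prompt + output well inside the 4096-token context window
    do_sample=True,
    temperature=0.7,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```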
