rinna/youri-7b-chat
Text generation · Model size: 7B · Quantization: FP8 · Context length: 4k · Concurrency cost: 1 · Published: Oct 30, 2023 · License: llama2 · Architecture: Transformer · Open weights

rinna/youri-7b-chat is a 7-billion-parameter instruction-tuned causal language model developed by rinna, based on the Llama 2 architecture with a 4096-hidden-size transformer. It is fine-tuned for chat-style interaction on a mix of English and Japanese instruction datasets, including Databricks Dolly, Anthropic HH RLHF, and FLAN. The model follows natural-language instructions in both English and Japanese, making it suitable for conversational AI applications that require strong bilingual capability.
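As a sketch of how such a chat model might be prompted, the snippet below assembles a multi-turn prompt using the `設定:` / `ユーザー:` / `システム:` turn prefixes that rinna documents for its chat models. The turn format and the helper name `build_prompt` are assumptions for illustration; verify the exact format against the model's published card before use.

```python
def build_prompt(context: str, turns: list[tuple[str, str]]) -> str:
    """Join a system context and (speaker, text) turns into one prompt,
    ending with an empty "システム:" line so generation continues as the
    assistant. Turn prefixes are assumed, not taken from this page."""
    lines = [f"設定: {context}"]
    for speaker, text in turns:
        prefix = "ユーザー" if speaker == "user" else "システム"
        lines.append(f"{prefix}: {text}")
    lines.append("システム: ")
    return "\n".join(lines)

prompt = build_prompt(
    "次の日本語を英語に翻訳してください。",  # system instruction
    [("user", "自然言語による指示に従い、さまざまなタスクを実行します。")],
)

# The resulting string would then be tokenized and passed to the model,
# e.g. with the Hugging Face transformers API:
#   from transformers import AutoTokenizer, AutoModelForCausalLM
#   tokenizer = AutoTokenizer.from_pretrained("rinna/youri-7b-chat")
#   model = AutoModelForCausalLM.from_pretrained("rinna/youri-7b-chat")
#   input_ids = tokenizer(prompt, return_tensors="pt").input_ids
#   output = model.generate(input_ids, max_new_tokens=128)
```

Keeping the prompt-assembly step as a pure function makes the conversation format easy to test and adjust independently of model loading.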
