stabilityai/japanese-stablelm-instruct-beta-7b
Task: Text generation
Concurrency cost: 1
Model size: 7B
Quantization: FP8
Context length: 4k
Published: Oct 30, 2023
License: llama2
Architecture: Transformer
Open weights: yes
stabilityai/japanese-stablelm-instruct-beta-7b is a 7-billion-parameter decoder-only language model developed by Stability AI Japan, based on the Llama 2 transformer architecture. It is fine-tuned on Japanese-translated versions of the Databricks Dolly-15k and Anthropic HH datasets, along with other public datasets. The model is designed specifically for instruction-following tasks in Japanese, offering a specialized option for applications that require high-quality Japanese language generation and understanding.
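Since the model is a Llama 2-based instruct model, prompts are typically wrapped in an instruction template before tokenization. The sketch below shows one plausible way to build such a prompt; the exact template (including the Japanese system message used here) is an assumption based on the common Llama 2 chat convention, so the authoritative format should be taken from the model card on Hugging Face.

```python
# Sketch: building a Llama 2-style instruction prompt for
# japanese-stablelm-instruct-beta-7b. The template and the system
# message are assumptions, not the confirmed official format.

SYSTEM_PROMPT = "あなたは役立つアシスタントです。"  # "You are a helpful assistant." (assumed)

def build_prompt(user_query: str, system: str = SYSTEM_PROMPT) -> str:
    """Wrap a user query in Llama 2 [INST]/<<SYS>> markers."""
    return f"<s>[INST] <<SYS>>\n{system}\n<</SYS>>\n\n{user_query} [/INST] "

prompt = build_prompt("日本の首都はどこですか?")
print(prompt)

# The resulting string would then be tokenized and passed to the model,
# e.g. with Hugging Face transformers:
#   tokenizer = AutoTokenizer.from_pretrained(
#       "stabilityai/japanese-stablelm-instruct-beta-7b")
#   model = AutoModelForCausalLM.from_pretrained(
#       "stabilityai/japanese-stablelm-instruct-beta-7b")
```

Keeping prompt construction in a small pure function like this makes the template easy to test and to swap out if the model card specifies a different format.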