stabilityai/japanese-stablelm-base-beta-7b
Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Context Length: 4k · Published: Oct 30, 2023 · License: llama2 · Architecture: Transformer · Open Weights

Japanese-StableLM-Base-Beta-7B is a 7-billion parameter decoder-only language model developed by Stability AI Japan, based on the Llama-2 architecture with a 4096-token context length. It is specifically fine-tuned on a diverse collection of Japanese data to maximize performance on Japanese language tasks. This model serves as a foundational base for application-specific fine-tuning, offering strong capabilities for Japanese text generation and understanding.
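As a minimal sketch of how this base model might be used for Japanese text generation, the snippet below loads it via the Hugging Face `transformers` library (an assumption; the card does not prescribe a runtime). The sampling parameters and the fp16 dtype are illustrative choices, not recommendations from the model card, and a 7B model in fp16 needs roughly 14 GB of memory.

```python
# Minimal sketch: loading japanese-stablelm-base-beta-7b with Hugging Face
# transformers (assumed runtime; dtype and sampling settings are illustrative).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "stabilityai/japanese-stablelm-base-beta-7b"

def generate(prompt: str, max_new_tokens: int = 128) -> str:
    """Generate a Japanese continuation of `prompt` with the base model."""
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID,
        torch_dtype=torch.float16,  # fp16 to halve memory; needs ~14 GB
        device_map="auto",          # place layers on available GPU(s)/CPU
    )
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    outputs = model.generate(
        **inputs,
        max_new_tokens=max_new_tokens,
        do_sample=True,
        temperature=0.7,
    )
    return tokenizer.decode(outputs[0], skip_special_tokens=True)

if __name__ == "__main__":
    # Base (non-instruct) model: give it text to continue, not instructions.
    print(generate("日本の首都は"))
```

Because this is a base model rather than an instruction-tuned one, prompts should be phrased as text to be continued; instruction-following behavior would require further application-specific fine-tuning, as the card notes.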
