NTQAI/chatntq-ja-7b-v1.0
Text generation · Model size: 7B · Quantization: FP8 · Context length: 8K · Published: Dec 26, 2023 · License: apache-2.0 · Architecture: Transformer · Open weights

NTQAI/chatntq-ja-7b-v1.0 is a 7-billion-parameter, decoder-only Japanese language model developed by NTQ AI. It is built on Japanese Stable LM Base Gamma 7B and fine-tuned on instruction-following datasets. Optimized specifically for Japanese-language tasks, it achieves a JA MT-Bench score of 6.65, indicating strong Japanese instruction-following performance relative to other models in its class.
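A minimal usage sketch with the Hugging Face `transformers` library is shown below. The `[INST] … [/INST]` prompt format is an assumption based on the Mistral-derived Japanese Stable LM Gamma 7B base; verify it against the model's own chat template before relying on it, and note that loading the full 7B checkpoint requires a GPU with sufficient memory.

```python
MODEL_ID = "NTQAI/chatntq-ja-7b-v1.0"


def build_prompt(user_message: str) -> str:
    # Assumed Llama/Mistral-style instruction format; confirm against the
    # model card's chat template before production use.
    return f"[INST] {user_message} [/INST]"


def generate(user_message: str, max_new_tokens: int = 256) -> str:
    # Heavy imports kept inside the function so the prompt helper above
    # can be used without transformers/torch installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto")
    inputs = tokenizer(build_prompt(user_message), return_tensors="pt").to(model.device)
    output = model.generate(**inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, skipping the prompt.
    new_tokens = output[0][inputs["input_ids"].shape[1]:]
    return tokenizer.decode(new_tokens, skip_special_tokens=True)


if __name__ == "__main__":
    print(generate("日本の首都はどこですか？"))
```

Because the model targets Japanese, prompts and responses are expected to be in Japanese, as in the example question above ("What is the capital of Japan?").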
