Overview
NTQAI/chatntq-ja-7b-v1.0 is a 7-billion-parameter, decoder-only Japanese language model developed by NTQ AI. It is fine-tuned from the Japanese Stable LM Base Gamma 7B base model on proprietary instruction-following datasets, and is designed to understand and generate Japanese text in response to instructions.
Key Capabilities
- Japanese Instruction Following: Specifically fine-tuned for Japanese language tasks, enabling it to respond effectively to instructions in Japanese.
- Competitive Performance: Achieves a Japanese MT-Bench score of 6.65 (as evaluated by gpt-4-0613), positioning it favorably against other Japanese language models in its size class.
- Decoder-Only Architecture: Based on a decoder-only transformer architecture, well suited to generative tasks.
Performance Benchmarks
The model's performance is primarily evaluated with Stability AI Japan's Japanese MT-Bench. With a score of 6.65, it outperforms several other 7B and 13B parameter Japanese models, including shisa-gamma-7b-v1 (6.12) and ELYZA-japanese-Llama-2-7b-fast-instruct (4.86). The evaluation used Japanese prompts and --num-choices 4 (four sampled answers per question) for a more representative assessment of its capabilities.
Usage
This model requires Transformers 4.34.0 or newer. Typical usage loads the model and tokenizer, constructs a prompt with a system message, and generates a response using the torch and transformers libraries.
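The usage flow above can be sketched as follows. This is a minimal, hedged example: the model id is taken from this card, but the prompt template, the Japanese system message, and the generation parameters are assumptions based on a typical Llama-style chat format, not confirmed here.

```python
MODEL_ID = "NTQAI/chatntq-ja-7b-v1.0"


def build_prompt(
    user_query: str,
    sys_msg: str = "あなたは役立つアシスタントです。",  # illustrative placeholder
) -> str:
    """Wrap a user query in a Llama-style [INST] chat template.

    NOTE: this template is an assumption based on common Llama-2-style
    chat formats; check the model card for the exact expected format.
    """
    return f"[INST] <<SYS>>\n{sys_msg}\n<</SYS>>\n\n{user_query} [/INST]"


def generate(user_query: str, max_new_tokens: int = 256) -> str:
    # Heavy dependencies are imported lazily so build_prompt() can be
    # used without torch/transformers installed.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID,
        torch_dtype=torch.float16,  # half precision to fit a 7B model on one GPU
        device_map="auto",
    )
    inputs = tokenizer(build_prompt(user_query), return_tensors="pt").to(model.device)
    with torch.no_grad():
        output_ids = model.generate(
            **inputs,
            max_new_tokens=max_new_tokens,
            do_sample=True,
            temperature=0.7,  # assumed sampling settings, tune as needed
        )
    # Strip the prompt tokens and return only the newly generated text.
    return tokenizer.decode(
        output_ids[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True
    )
```

For example, `generate("日本の首都はどこですか？")` would return a Japanese answer; lazy imports keep the prompt helper testable without downloading the 7B weights.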