abeja/ABEJA-Qwen2.5-7b-Japanese-v0.1

Status: Warm
Visibility: Public
Parameters: 7.6B
Tensor type: FP8
Context length: 32768
Updated: Mar 12, 2025
License: apache-2.0
Source: Hugging Face

ABEJA-Qwen2.5-7b-Japanese-v0.1 is a 7.6 billion parameter language model developed by ABEJA, based on Qwen/Qwen2.5-7B-Instruct. It was trained by distillation from abeja/ABEJA-Qwen2.5-32b-Japanese-v0.1, with a focus on Japanese language capability. Instruction-following performance was further improved via ChatVector, making the model well suited to Japanese-centric conversational AI applications.
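Since the model is based on Qwen2.5-7B-Instruct, prompts follow the ChatML format that Qwen2.5 uses. In practice this is handled by the tokenizer's `apply_chat_template`, but a minimal sketch of what that format looks like (illustrative only; the helper function name is hypothetical):

```python
def build_chatml_prompt(messages):
    # Qwen2.5-family models use the ChatML format: each turn is wrapped in
    # <|im_start|>{role}\n{content}<|im_end|> markers.
    parts = [
        f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>"
        for m in messages
    ]
    # End with an open assistant header so the model generates the reply.
    parts.append("<|im_start|>assistant\n")
    return "\n".join(parts)

prompt = build_chatml_prompt([
    {"role": "system", "content": "あなたは誠実で優秀な日本語アシスタントです。"},
    {"role": "user", "content": "日本の首都はどこですか?"},
])
```

When using the `transformers` library directly, prefer `tokenizer.apply_chat_template(messages, add_generation_prompt=True)` over hand-building this string, since the tokenizer ships the canonical template for the model.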
