hakutaku/qwen2.5-ja-zh
Text Generation | Model Size: 7.6B | Quant: FP8 | Context Length: 32k | Concurrency Cost: 1 | Architecture: Transformer | Published: Sep 19, 2024

hakutaku/qwen2.5-ja-zh is a 7.6-billion-parameter language model based on Qwen2.5-7B-Instruct, fine-tuned specifically for high-quality Japanese-to-Chinese translation. With a 32,768-token context window, it is optimized for accurate, fluent cross-lingual conversion, making it well suited to applications that require robust Japanese-to-Chinese translation.
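As a minimal sketch of how the model might be invoked, the snippet below assembles a chat-completion request payload for a Japanese-to-Chinese translation. It assumes the model is served behind an OpenAI-compatible chat API; the system prompt and the request shape are illustrative assumptions, not documented values for this model.

```python
# Sketch: building a chat-completion request for JA->ZH translation
# with hakutaku/qwen2.5-ja-zh. Assumes an OpenAI-compatible chat API;
# the system prompt below is an illustrative assumption.
import json

def build_translation_request(japanese_text: str, max_tokens: int = 1024) -> dict:
    """Assemble a chat payload asking the model to translate JA -> ZH."""
    return {
        "model": "hakutaku/qwen2.5-ja-zh",
        "messages": [
            {"role": "system",
             "content": "Translate the following Japanese text into Chinese."},
            {"role": "user", "content": japanese_text},
        ],
        "max_tokens": max_tokens,
    }

payload = build_translation_request("こんにちは、世界。")
print(json.dumps(payload, ensure_ascii=False, indent=2))
```

The payload would then be POSTed to the serving endpoint; with a 32k context window, long source passages can be sent in a single request rather than chunked.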
