webbigdata/ALMA-7B-Ja
Text generation · Concurrency cost: 1 · Model size: 7B · Quantization: FP8 · Context length: 4k · Published: Oct 7, 2023 · Architecture: Transformer

ALMA-7B-Ja is a 7-billion-parameter machine translation model developed by webbigdata, based on the ALMA training method. It specializes in Japanese-English translation, replacing the Russian support found in the original ALMA-7B. The model follows ALMA's two-step fine-tuning process, first on monolingual data and then on high-quality parallel data, to achieve strong translation performance for the Japanese-English language pair.
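As a rough illustration, ALMA-family models are typically prompted with a fixed translation template rather than raw source text. The sketch below builds such a prompt; the exact template and the Hugging Face usage shown in comments are assumptions based on the original ALMA models, so verify them against the model card before relying on them.

```python
def build_prompt(japanese_text: str) -> str:
    """Wrap a Japanese sentence in an ALMA-style translation prompt
    (assumed format; check the webbigdata/ALMA-7B-Ja model card)."""
    return (
        "Translate this from Japanese to English:\n"
        f"Japanese: {japanese_text}\n"
        "English:"
    )

# Typical inference sketch (assumes standard Hugging Face transformers usage):
#   from transformers import AutoModelForCausalLM, AutoTokenizer
#   tok = AutoTokenizer.from_pretrained("webbigdata/ALMA-7B-Ja")
#   model = AutoModelForCausalLM.from_pretrained("webbigdata/ALMA-7B-Ja")
#   ids = tok(build_prompt("猫が好きです。"), return_tensors="pt").input_ids
#   out = model.generate(ids, max_new_tokens=128)
#   print(tok.decode(out[0], skip_special_tokens=True))
```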
