maywell/Synatra-7B-v0.3-Translation
Task: Text generation
Concurrency cost: 1
Model size: 7B
Quantization: FP8
Context length: 4k
Published: Nov 17, 2023
License: cc-by-sa-4.0
Architecture: Transformer (open weights)

Synatra-7B-v0.3-Translation is a 7 billion parameter language model developed by maywell, fine-tuned from Mistral-7B-Instruct-v0.1. The model specializes in translation between Korean and English, and was trained on a filtered version of the sharegpt_deepl_ko_translation dataset. It is optimized for accurate, contextually appropriate translations, making it suitable for applications that require robust Korean–English communication.
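A minimal sketch of how one might query the model for Korean-to-English translation with Hugging Face `transformers`. The prompt wrapper follows the `[INST]` instruction format of the Mistral-7B-Instruct-v0.1 base model; the exact instruction wording is an assumption, not documented by this card.

```python
def build_translation_prompt(text: str, source: str = "Korean",
                             target: str = "English") -> str:
    """Wrap a translation request in the Mistral [INST] instruction format.

    The instruction phrasing here is illustrative; adjust it to taste.
    """
    instruction = f"Translate the following {source} text into {target}:\n{text}"
    return f"<s>[INST] {instruction} [/INST]"


# Loading and generation (commented out: downloads ~7B weights):
# from transformers import AutoModelForCausalLM, AutoTokenizer
# repo = "maywell/Synatra-7B-v0.3-Translation"
# tokenizer = AutoTokenizer.from_pretrained(repo)
# model = AutoModelForCausalLM.from_pretrained(repo, device_map="auto")
# inputs = tokenizer(build_translation_prompt("안녕하세요"), return_tensors="pt")
# output = model.generate(**inputs, max_new_tokens=256)
# print(tokenizer.decode(output[0], skip_special_tokens=True))
```

Note that the 4k context length bounds the combined size of the prompt and the generated translation, so long documents should be split into chunks before translation.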
