winninghealth/WiNGPT-Babel
Text generation · Model size: 1.5B · Quantization: BF16 · Context length: 32k · Published: Dec 17, 2024 · License: apache-2.0 · Architecture: Transformer · Concurrency cost: 1 · Open weights

WiNGPT-Babel is a 1.5 billion parameter language model developed by winninghealth and purpose-built for translation. Based on the Qwen2.5-1.5B architecture, it is trained with a human-in-the-loop data production strategy with the goal of native-level multilingual information access. The model translates a range of content formats, including web pages, academic papers, news articles, and video subtitles, and supports more than 20 languages with an emphasis on accuracy and real-time performance.
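As a minimal usage sketch, the model can be driven through the standard Hugging Face `transformers` chat-template workflow used by Qwen2.5-based models. The repo id `winninghealth/WiNGPT-Babel` matches this listing, but the exact prompt wording below is an illustrative assumption, not the official template; consult the model card for the recommended prompt.

```python
# Hedged sketch: translating text with WiNGPT-Babel via transformers.
# The system-prompt wording is an assumption; the model card may
# specify a different recommended prompt format.
from transformers import AutoModelForCausalLM, AutoTokenizer


def build_messages(text: str, target_lang: str = "English") -> list[dict]:
    """Build a chat-style translation request (prompt wording is illustrative)."""
    return [
        {"role": "system", "content": f"Translate the following text into {target_lang}."},
        {"role": "user", "content": text},
    ]


def translate(text: str, target_lang: str = "English",
              model_id: str = "winninghealth/WiNGPT-Babel") -> str:
    """Load the model, apply the chat template, and decode the generated translation."""
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype="auto")
    input_ids = tokenizer.apply_chat_template(
        build_messages(text, target_lang),
        add_generation_prompt=True,
        return_tensors="pt",
    )
    output_ids = model.generate(input_ids, max_new_tokens=512)
    # Strip the prompt tokens, keep only the newly generated translation.
    return tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True)
```

At 1.5B parameters in BF16, the model needs roughly 3 GB of memory, so it is small enough to run on a single consumer GPU or CPU for interactive translation.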
