beomi/gemma-mling-7b
Text Generation

- Concurrency Cost: 1
- Model Size: 8.5B
- Quant: FP8
- Ctx Length: 8k
- Published: Apr 15, 2024
- License: gemma-terms-of-use
- Architecture: Transformer

beomi/gemma-mling-7b is an 8.5 billion parameter Gemma-based causal language model, continually pretrained by Junbum Lee and Taekyoon Choi. It is optimized for multilingual performance, with a focus on Korean, English, Chinese, and Japanese, trained alongside a multilingual corpus covering 500 languages. The model is suited to generating text in these languages, making it a candidate for global applications that require strong multilingual understanding and generation.
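A minimal sketch of running the model for text generation, assuming the standard Hugging Face `transformers` causal-LM API (this is not an official usage example from the model authors; the prompt and generation settings are illustrative):

```python
# Hedged sketch: generate text with beomi/gemma-mling-7b via transformers.
# Assumes transformers and torch are installed and the model weights are
# downloadable from the Hugging Face Hub.
from transformers import AutoModelForCausalLM, AutoTokenizer


def generate(prompt: str, model_id: str = "beomi/gemma-mling-7b") -> str:
    """Load the model and return a short completion for `prompt`."""
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    # Greedy decoding with a small budget; tune max_new_tokens as needed.
    outputs = model.generate(**inputs, max_new_tokens=64)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)


if __name__ == "__main__":
    # Korean prompt, reflecting the model's multilingual focus.
    print(generate("한국어로 짧은 자기소개를 작성해 주세요."))
```

In practice an FP8-quantized 8.5B model still needs a GPU with enough memory; `device_map="auto"` lets `accelerate` place the weights across available devices.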
