beomi/gemma-ko-2b

Task: Text generation
Model size: 2.6B parameters
Quantization: BF16
Context length: 8k
Concurrency cost: 1
Published: Mar 26, 2024
License: gemma-terms-of-use
Architecture: Transformer

Gemma-Ko-2b is a 2.6 billion parameter, decoder-only large language model developed by Junbum Lee (Beomi) & Taekyoon Choi (Taekyoon), based on Google's Gemma architecture. This model is specifically designed for text generation tasks in both Korean and English, including question answering, summarization, and reasoning. Its relatively small size and open weights make it suitable for deployment in resource-limited environments, democratizing access to advanced AI capabilities.
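As a sketch of how such a model is typically used for text generation, the snippet below loads it through the Hugging Face `transformers` library. The generation settings and the `clamp_new_tokens` helper are illustrative assumptions, not part of the official model card; the helper simply keeps prompt plus generated tokens within the 8k context window noted above.

```python
MODEL_ID = "beomi/gemma-ko-2b"
MAX_CTX = 8192  # 8k context window noted above


def clamp_new_tokens(prompt_len: int, requested: int, max_ctx: int = MAX_CTX) -> int:
    """Cap the generation budget so prompt + new tokens fit the context window."""
    return max(0, min(requested, max_ctx - prompt_len))


def generate(prompt: str, max_new_tokens: int = 128) -> str:
    """Load the model and generate a completion (downloads weights on first run)."""
    # Imported lazily so the pure-Python helper above has no heavy dependencies.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID,
        torch_dtype=torch.bfloat16,  # matches the BF16 weights listed above
        device_map="auto",
    )
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    budget = clamp_new_tokens(inputs["input_ids"].shape[-1], max_new_tokens)
    outputs = model.generate(**inputs, max_new_tokens=budget)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)


if __name__ == "__main__":
    # Korean prompt: "What is the capital of Korea?"
    print(generate("한국의 수도는 어디인가요?"))
```

Because the model is bilingual, the same call works with English prompts; on resource-limited hardware the BF16 weights need roughly 5 GB of memory.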
