lemon-mint/gemma-ko-1.1-2b-it
Text generation · Concurrency cost: 1 · Model size: 2.6B · Quantization: BF16 · Context length: 8k · License: Gemma · Architecture: Transformer

lemon-mint/gemma-ko-1.1-2b-it is a 2.6-billion-parameter instruction-tuned language model created by lemon-mint. It merges Google's gemma-1.1-2b-it and gemma-2b with beomi/gemma-ko-2b using the SLERP merge method, combining the strengths of the constituent Gemma models to improve performance on Korean-language tasks in particular. With a context length of 8192 tokens, it is suited to applications that require robust language understanding and generation.
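The SLERP (spherical linear interpolation) merge mentioned above blends each pair of corresponding weight tensors along the arc between them rather than averaging them linearly, which better preserves the magnitude of the merged weights. A minimal sketch of the idea on plain Python lists (the function name and the linear-interpolation fallback for near-parallel vectors are illustrative, not the actual merge tooling's implementation):

```python
import math

def slerp(v0, v1, t, eps=1e-8):
    """Spherical linear interpolation between two weight vectors.

    Blends v0 and v1 along the great circle between them; t=0 returns
    v0, t=1 returns v1. Unlike plain linear averaging, the result's
    norm stays between the norms of the endpoints.
    """
    n0 = math.sqrt(sum(x * x for x in v0))
    n1 = math.sqrt(sum(x * x for x in v1))
    # Angle between the two vectors, with the cosine clamped for safety.
    cos_omega = sum(a * b for a, b in zip(v0, v1)) / (n0 * n1)
    cos_omega = max(-1.0, min(1.0, cos_omega))
    omega = math.acos(cos_omega)
    if omega < eps:
        # Nearly parallel vectors: fall back to linear interpolation.
        return [(1 - t) * a + t * b for a, b in zip(v0, v1)]
    s0 = math.sin((1 - t) * omega) / math.sin(omega)
    s1 = math.sin(t * omega) / math.sin(omega)
    return [s0 * a + s1 * b for a, b in zip(v0, v1)]
```

For example, interpolating halfway between the orthogonal unit vectors `[1, 0]` and `[0, 1]` yields `[0.7071..., 0.7071...]`, which is still a unit vector, whereas a plain average would give `[0.5, 0.5]` with norm ≈ 0.707.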
