SillyTilly/google-gemma-2-27b-it
Task: Text Generation
Concurrency Cost: 2
Model Size: 27B
Quantization: FP8
Context Length: 32k
Published: Jun 27, 2024
License: Gemma
Architecture: Transformer
Status: Cold

SillyTilly/google-gemma-2-27b-it is a 27-billion-parameter, instruction-tuned, decoder-only large language model developed by Google. Built from the same research and technology used to create the Gemini models, it is designed for a variety of text generation tasks, including question answering, summarization, and reasoning. The model's open weights and relatively modest deployment footprint make it suitable for resource-limited environments, broadening access to advanced AI capabilities.
