google/gemma-1.1-2b-it
Text generation
- Concurrency cost: 1
- Model size: 2.6B
- Quantization: BF16
- Context length: 8k
- Published: Mar 26, 2024
- License: Gemma
- Architecture: Transformer (decoder-only)
- Access: gated
google/gemma-1.1-2b-it is a 2.6-billion-parameter, instruction-tuned, decoder-only large language model from Google, part of the Gemma family built from the same research and technology as the Gemini models. The 1.1 update was trained with a novel RLHF method, yielding substantial gains in overall quality, coding, factuality, instruction following, and multi-turn conversation. Its small size and optimized performance make it suitable for text generation in resource-limited environments.
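For multi-turn conversation, Gemma instruction-tuned checkpoints use a turn-based chat template with `<start_of_turn>`/`<end_of_turn>` markers and `user`/`model` roles. A minimal sketch of building that prompt by hand (in practice, `tokenizer.apply_chat_template` from `transformers` does this for you):

```python
def build_gemma_prompt(messages):
    """Format a chat history using Gemma's turn-based prompt template.

    Gemma's template uses the roles "user" and "model" (there is no
    system role), each turn wrapped in <start_of_turn>/<end_of_turn>.
    """
    prompt = ""
    for msg in messages:
        # Map the common "assistant" role name onto Gemma's "model" role.
        role = "model" if msg["role"] == "assistant" else "user"
        prompt += f"<start_of_turn>{role}\n{msg['content']}<end_of_turn>\n"
    # End with an open model turn so generation continues as the model.
    prompt += "<start_of_turn>model\n"
    return prompt

prompt = build_gemma_prompt(
    [{"role": "user", "content": "Write a haiku about autumn."}]
)
print(prompt)
```

The trailing open `<start_of_turn>model` turn is what cues the model to produce its reply rather than continue the user's text.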
Popular Sampler Settings
Top 3 parameter combinations used by Featherless users for this model:
- temperature: –
- top_p: –
- top_k: –
- frequency_penalty: –
- presence_penalty: –
- repetition_penalty: –
- min_p: –
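The sampler parameters above map directly onto fields of an OpenAI-style chat completion request. A sketch of assembling such a request body, assuming an OpenAI-compatible endpoint; the specific values here are illustrative placeholders, not the actual top user configurations:

```python
import json

# Hypothetical sampler values for illustration only; the real
# "popular" combinations are not listed in this page snapshot.
sampler = {
    "temperature": 0.7,
    "top_p": 0.9,
    "top_k": 40,
    "frequency_penalty": 0.0,
    "presence_penalty": 0.0,
    "repetition_penalty": 1.1,
    "min_p": 0.05,
}

# Request body for an OpenAI-compatible /v1/chat/completions endpoint.
payload = {
    "model": "google/gemma-1.1-2b-it",
    "messages": [{"role": "user", "content": "Write a haiku about autumn."}],
    **sampler,
}
print(json.dumps(payload, indent=2))
```

Note that `top_k`, `repetition_penalty`, and `min_p` are extensions beyond the original OpenAI schema; whether a given server accepts them depends on its implementation.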