echodpp/gemma-2-2b
TEXT GENERATION
Concurrency Cost: 1 · Model Size: 2.6B · Quant: BF16 · Ctx Length: 8k · Published: Mar 23, 2026 · License: gemma · Architecture: Transformer · Warm
Gemma 2 2B is a 2.6 billion parameter, decoder-only, text-to-text large language model developed by Google, built from the same research and technology as the Gemini models. Trained on 2 trillion tokens, it offers open weights and is optimized for a variety of text generation tasks including question answering, summarization, and reasoning. Its compact size and efficient design make it suitable for deployment in resource-limited environments like laptops and desktops, democratizing access to advanced AI capabilities.
Popular Sampler Settings
Top 3 parameter combinations used by Featherless users for this model.
temperature: –
top_p: –
top_k: –
frequency_penalty: –
presence_penalty: –
repetition_penalty: –
min_p: –
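The parameters above control how the next token is chosen from the model's output distribution. As a rough sketch of what they do (this is an illustrative, generic sampling pipeline, not Featherless's actual server code; the penalty parameters, which adjust logits for already-generated tokens, are omitted for brevity):

```python
# Illustrative sketch of temperature / top_k / top_p / min_p filtering.
# All names and values are assumptions for demonstration only.
import math
import random

def sample_next_token(logits, temperature=1.0, top_k=0, top_p=1.0,
                      min_p=0.0, rng=None):
    """Filter a token->logit dict with top_k / top_p / min_p, then sample."""
    rng = rng or random.Random()
    # Temperature scaling: <1 sharpens the distribution, >1 flattens it.
    scaled = {t: l / max(temperature, 1e-8) for t, l in logits.items()}
    # Softmax over the scaled logits (shifted by the max for stability).
    m = max(scaled.values())
    exp = {t: math.exp(l - m) for t, l in scaled.items()}
    z = sum(exp.values())
    probs = {t: e / z for t, e in exp.items()}
    # Rank tokens by probability, highest first.
    ranked = sorted(probs.items(), key=lambda kv: kv[1], reverse=True)
    # top_k: keep only the k most likely tokens (0 disables the filter).
    if top_k > 0:
        ranked = ranked[:top_k]
    # top_p (nucleus): keep the smallest prefix whose mass reaches top_p.
    if top_p < 1.0:
        kept, mass = [], 0.0
        for t, p in ranked:
            kept.append((t, p))
            mass += p
            if mass >= top_p:
                break
        ranked = kept
    # min_p: drop tokens below min_p times the top token's probability.
    if min_p > 0.0:
        cutoff = min_p * ranked[0][1]
        ranked = [(t, p) for t, p in ranked if p >= cutoff]
    # Renormalize over the survivors and draw one token.
    total = sum(p for _, p in ranked)
    r = rng.random() * total
    for t, p in ranked:
        r -= p
        if r <= 0:
            return t
    return ranked[-1][0]
```

For example, `top_k=1` reduces to greedy decoding (always the single most likely token), while a low `temperature` with `top_p=0.9` trades diversity for coherence.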