google/gemma-3-1b-pt
Text generation · Concurrency cost: 1 · Model size: 1B · Quant: BF16 · Context length: 32K · Published: Feb 20, 2025 · License: Gemma · Architecture: Transformer · 0.2K · Gated · Warm
Gemma 3 1B PT is a 1 billion parameter pre-trained language model developed by Google DeepMind, built from the same research and technology as the Gemini models. Unlike the larger Gemma 3 variants, the 1B model is text-only: it takes text input and generates text output, with a 32K token context window. The Gemma 3 family offers multilingual support for over 140 languages. This model is well suited for text generation tasks such as question answering, summarization, and reasoning, and its small size makes it practical for deployment in resource-limited environments.
Popular Sampler Settings
Top 3 parameter combinations used by Featherless users for this model.
temperature: – · top_p: – · top_k: – · frequency_penalty: – · presence_penalty: – · repetition_penalty: – · min_p: –
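The sampler parameters above control how the next token is chosen at generation time. As a rough illustration of how temperature, top_k, and top_p interact, here is a minimal sketch (a hypothetical `sample_next_token` helper, not Featherless's actual implementation):

```python
import numpy as np

def sample_next_token(logits, temperature=1.0, top_k=0, top_p=1.0, rng=None):
    """Pick a token index from raw logits.

    temperature scales the logits (lower = greedier), top_k keeps only
    the k highest-scoring tokens, and top_p (nucleus sampling) keeps the
    smallest set of tokens whose cumulative probability reaches p.
    """
    rng = rng or np.random.default_rng()
    logits = np.asarray(logits, dtype=np.float64) / max(temperature, 1e-8)

    # top-k: mask everything below the k-th highest logit
    if top_k > 0:
        kth = np.sort(logits)[-top_k]
        logits = np.where(logits < kth, -np.inf, logits)

    # softmax over the surviving logits
    probs = np.exp(logits - np.max(logits))
    probs /= probs.sum()

    # top-p: keep the highest-probability tokens until their mass >= top_p
    if top_p < 1.0:
        order = np.argsort(probs)[::-1]
        cum = np.cumsum(probs[order])
        cutoff = np.searchsorted(cum, top_p) + 1
        kept = np.zeros_like(probs)
        kept[order[:cutoff]] = probs[order[:cutoff]]
        probs = kept / kept.sum()

    return int(rng.choice(len(probs), p=probs))
```

With `top_k=1` this reduces to greedy decoding; frequency, presence, and repetition penalties (not shown) would adjust the logits before this step based on tokens already generated.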