allura-org/Gemma-3-Glitter-4B
Vision: supported · Concurrency Cost: 1 · Model Size: 4.3B · Quant: BF16 · Ctx Length: 32k · Published: Mar 26, 2025 · Architecture: Transformer

allura-org/Gemma-3-Glitter-4B is a 4.3-billion-parameter language model based on the Gemma 3 architecture, with a 32,768-token context length. It uses the same data mix as the Glitter 12B variant and targets general language understanding and generation tasks. The model is intended for applications that need a balance of quality and efficiency within its parameter class.