allura-org/Gemma-3-Glitter-4B is a 4.3-billion-parameter language model based on the Gemma 3 architecture, with a 32,768-token context length. It is trained on the same data mix as the Glitter 12B variant and targets general language understanding and generation, offering a balance of capability and efficiency within its parameter class.