datek/google-gemma-2b-1717426780
Text generation · Concurrency cost: 1 · Model size: 2.6B · Quant: BF16 · Context length: 8k · Architecture: Transformer · Status: Warm

The datek/google-gemma-2b-1717426780 model is a 2.6 billion parameter language model based on Google's Gemma architecture. It is a Hugging Face Transformers model that was automatically pushed to the Hub. Because its model card provides few details, its primary differentiators and intended use cases beyond general-purpose text generation are not explicitly documented.
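From the listed specs alone (2.6B parameters, BF16 weights) one can sketch a rough estimate of the memory needed just to hold the model weights; this is a back-of-the-envelope calculation, not a measured figure, and ignores activation, KV-cache, and framework overhead:

```python
# Rough weight-memory estimate for a 2.6B-parameter model stored in BF16.
# Figures are taken from the listing above; the result is an approximation.
params = 2.6e9          # parameter count from the listing
bytes_per_param = 2     # BF16 uses 2 bytes per parameter

total_bytes = params * bytes_per_param
gib = total_bytes / 2**30  # convert bytes to GiB

print(f"Estimated weight memory: {gib:.1f} GiB")  # roughly 4.8 GiB
```

Actual runtime memory will be higher once activations and the 8k-token KV cache are accounted for.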
