timonziegenbein/gemma-2-9b-alpaca
Text Generation
Concurrency Cost: 1
Model Size: 9B
Quant: FP8
Ctx Length: 16k
Published: Mar 24, 2025
Architecture: Transformer
Cold: 0.0K

The timonziegenbein/gemma-2-9b-alpaca model is a 9-billion-parameter language model with a 16,384-token (16k) context window. It is based on the Gemma 2 architecture and fine-tuned for instruction following, making it suitable for general-purpose conversational AI tasks such as question answering, summarization, and dialogue that require understanding and generating human-like text.
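The "alpaca" suffix suggests the model was fine-tuned on Alpaca-style instruction data, which conventionally uses a fixed prompt template. As a sketch, the standard Alpaca template can be built like this (whether this exact template matches the checkpoint's training format is an assumption; verify against the repository before relying on it):

```python
def build_alpaca_prompt(instruction: str, input_text: str = "") -> str:
    """Format an instruction (and optional input) using the standard
    Alpaca prompt template. This template is an assumption based on the
    model's name, not confirmed documentation for this checkpoint."""
    if input_text:
        return (
            "Below is an instruction that describes a task, paired with an "
            "input that provides further context. Write a response that "
            "appropriately completes the request.\n\n"
            f"### Instruction:\n{instruction}\n\n"
            f"### Input:\n{input_text}\n\n"
            "### Response:\n"
        )
    return (
        "Below is an instruction that describes a task. Write a response "
        "that appropriately completes the request.\n\n"
        f"### Instruction:\n{instruction}\n\n"
        "### Response:\n"
    )

# Example: a prompt with both an instruction and an input.
prompt = build_alpaca_prompt(
    "Summarize the following text.",
    "Gemma 2 is a 9B-parameter language model.",
)
```

The resulting string would then be passed to the model's tokenizer and generation call; with a 16k context window, long inputs can be included directly in the `### Input:` section rather than chunked.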
