NicolasRodriguez/manaba_gemma_2_2b
Text Generation | Concurrency Cost: 1 | Model Size: 2.6B | Quant: BF16 | Ctx Length: 8k | Published: Dec 3, 2025 | Architecture: Transformer | Status: Warm

NicolasRodriguez/manaba_gemma_2_2b is a 2.6-billion-parameter decoder-only large language model, fine-tuned for Spanish on top of Google's Gemma 2 2B architecture. The model targets text-generation tasks such as question answering, summarization, and reasoning, and adheres to Google's safety and integrity policies. Its relatively small size makes it suitable for deployment in resource-limited environments.
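A minimal usage sketch, assuming the model is hosted under the repo ID shown on this card and that the `transformers` and `torch` packages are installed; the BF16 dtype and 8k context length come from the card's metadata, while the example prompt and the `generate` helper name are illustrative:

```python
MODEL_ID = "NicolasRodriguez/manaba_gemma_2_2b"
MAX_CONTEXT = 8192  # 8k context length, per the card's metadata


def generate(prompt: str, max_new_tokens: int = 64) -> str:
    """Generate a Spanish completion for `prompt` with the fine-tuned model."""
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    # bfloat16 matches the published BF16 quantization and halves memory vs FP32.
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID, torch_dtype=torch.bfloat16, device_map="auto"
    )
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    outputs = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)


# Example (downloads the weights on first call):
# print(generate("Resume en una frase la historia de Madrid."))
```

Loading in BF16 keeps the ~2.6B-parameter model around 5 GB of weights, which is what makes it practical on a single consumer GPU or a modest cloud instance.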
