exonics/gemma_absa_en_yeni1
Text Generation
Concurrency Cost: 1 · Model Size: 9B · Quant: FP8 · Ctx Length: 16k · Published: Mar 1, 2026 · License: apache-2.0 · Architecture: Transformer · Open Weights · Cold

exonics/gemma_absa_en_yeni1 is a 9-billion-parameter Gemma 2 model developed by exonics, fine-tuned from ytu-ce-cosmos/Turkish-Gemma-9b-v0.1. Training emphasized efficiency, using Unsloth together with Hugging Face's TRL library, which the authors report yielded roughly 2x faster training. Its primary differentiator is this optimized training process, making it a practical base for teams that need to fine-tune large language models on limited compute.
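The card does not include a usage snippet. A minimal sketch of loading the checkpoint with Hugging Face `transformers` might look like the following; only the repo id comes from the card, while the chat-turn markers follow the standard Gemma 2 format and the generation parameters are illustrative assumptions. The heavy download/inference path is kept inside a function (with its imports) so the module stays importable without `transformers` installed:

```python
MODEL_ID = "exonics/gemma_absa_en_yeni1"  # repo id taken from the model card


def build_prompt(text: str) -> str:
    """Wrap user text in Gemma 2 chat-turn markers (assumed inherited from the base model)."""
    return f"<start_of_turn>user\n{text}<end_of_turn>\n<start_of_turn>model\n"


def generate(text: str, max_new_tokens: int = 128) -> str:
    """Download the checkpoint and run generation (network- and memory-heavy for a 9B model)."""
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto")
    inputs = tokenizer(build_prompt(text), return_tensors="pt").to(model.device)
    out = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(out[0], skip_special_tokens=True)
```

This is a sketch, not an official example; check the repository files for a tokenizer chat template before relying on the hand-written prompt format.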
