exonics/gemma_absa_en_yeni1

Text Generation · Concurrency Cost: 1 · Model Size: 9B · Quant: FP8 · Ctx Length: 16k · Published: Mar 1, 2026 · License: apache-2.0 · Architecture: Transformer · Open Weights

exonics/gemma_absa_en_yeni1 is a 9-billion-parameter Gemma2 model developed by exonics, fine-tuned from ytu-ce-cosmos/Turkish-Gemma-9b-v0.1. Training emphasized efficiency, using Unsloth together with Hugging Face's TRL library to achieve roughly 2x faster training. Its primary differentiator is this optimized training pipeline, which makes it well suited to applications that require efficient fine-tuning of large language models.


Overview

exonics/gemma_absa_en_yeni1 is a 9-billion-parameter Gemma2 model developed by exonics. It was fine-tuned from the ytu-ce-cosmos/Turkish-Gemma-9b-v0.1 base model, suggesting specialization or adaptation from a Turkish-language foundation.

Key Characteristics

  • Model Family: Gemma2
  • Parameter Count: 9 billion parameters
  • Developer: exonics
  • Training Efficiency: The model was trained roughly 2x faster by leveraging Unsloth together with Hugging Face's TRL library, reflecting an optimized fine-tuning process.
  • License: Released under the Apache-2.0 license, allowing for broad use and distribution.
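As a Gemma2-family model, it expects prompts in Gemma's turn-based chat format. The authoritative template ships with the model's tokenizer (via `tokenizer.apply_chat_template`); the sketch below simply mirrors the publicly documented Gemma turn markers for illustration, and the helper name is our own.

```python
def format_gemma_prompt(user_message: str) -> str:
    """Build a single-turn prompt using Gemma's documented chat markers.

    Note: this is an illustrative sketch; in practice, prefer the
    tokenizer's own template via tokenizer.apply_chat_template().
    """
    return (
        "<start_of_turn>user\n"
        f"{user_message}<end_of_turn>\n"
        "<start_of_turn>model\n"
    )

prompt = format_gemma_prompt("Summarize: 'Great battery, poor screen.'")
print(prompt)
```

The trailing `<start_of_turn>model` marker cues the model to begin its reply; generation is typically stopped at the next `<end_of_turn>` token.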

Potential Use Cases

Given its efficient fine-tuning and Gemma2 architecture, this model could be particularly useful for:

  • Rapid Prototyping: Developers can quickly adapt and deploy this model for various NLP tasks due to its optimized training.
  • Resource-Constrained Environments: The efficiency gains from Unsloth suggest it might be suitable for environments where training time or computational resources are a concern.
  • Further Fine-tuning: Its foundation and efficient training make it a good candidate for additional domain-specific fine-tuning or task adaptation.
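To gauge suitability for resource-constrained environments, a back-of-the-envelope estimate of weight memory is useful. Assuming the 9B parameter count and FP8 quantization listed above (and ignoring KV cache, activations, and framework overhead), the weights alone occupy roughly half the memory of an FP16 checkpoint:

```python
def approx_weight_gib(num_params: float, bytes_per_param: float) -> float:
    """Rough memory for model weights only, in GiB.

    Excludes KV cache, activations, and framework overhead,
    so treat the result as a lower bound.
    """
    return num_params * bytes_per_param / 2**30

PARAMS = 9e9  # 9B parameters, per the model card

fp16_gib = approx_weight_gib(PARAMS, 2.0)  # 16-bit weights
fp8_gib = approx_weight_gib(PARAMS, 1.0)   # 8-bit (FP8) weights
print(f"FP16: ~{fp16_gib:.1f} GiB, FP8: ~{fp8_gib:.1f} GiB")
```

Under these assumptions, FP8 weights come to roughly 8.4 GiB versus about 16.8 GiB for FP16, which is why the FP8 quantization matters for fitting the model on a single consumer-class GPU.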