akera/translategemma-12b-ug40-sft-combined-merged

Modality: Vision · Concurrency Cost: 1 · Model Size: 12B · Quant: FP8 · Ctx Length: 32k · Published: Mar 15, 2026 · License: apache-2.0 · Architecture: Transformer · Open Weights

akera/translategemma-12b-ug40-sft-combined-merged is a 12-billion-parameter Gemma 3 model developed by akera and fine-tuned from Sunbird/translategemma-12b-ug40. It was trained with Unsloth and Hugging Face's TRL library, which is reported to have made training about 2x faster. The model targets general language tasks, leveraging its large parameter count and efficient training methodology.


Model Overview

akera/translategemma-12b-ug40-sft-combined-merged is a 12-billion-parameter language model developed by akera. It is based on the Gemma 3 architecture and was fine-tuned from the Sunbird/translategemma-12b-ug40 model.

Key Characteristics

  • Architecture: Gemma 3 model with 12 billion parameters.
  • Training Efficiency: Fine-tuned using Unsloth together with Hugging Face's TRL library, with a reported 2x speedup over standard fine-tuning.
  • License: Distributed under the Apache-2.0 license.
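Since the weights are merged (no separate adapter), the model should load with the standard Hugging Face `transformers` API. The sketch below is a minimal, hedged example: the card does not document the expected prompt format, so `build_prompt` is a hypothetical template (check the upstream Sunbird model card for the real one), and the generation settings are illustrative defaults.

```python
MODEL_ID = "akera/translategemma-12b-ug40-sft-combined-merged"


def build_prompt(text: str, src_lang: str, tgt_lang: str) -> str:
    """Hypothetical prompt template -- the card does not specify the
    input format expected by the fine-tune; adjust to the upstream
    Sunbird/translategemma-12b-ug40 conventions."""
    return f"Translate from {src_lang} to {tgt_lang}:\n{text}"


def translate(text: str, src_lang: str, tgt_lang: str, max_new_tokens: int = 128) -> str:
    # Imports kept local so the prompt helper above stays usable
    # without torch/transformers installed.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID,
        torch_dtype=torch.bfloat16,  # FP8 serving shown on the card needs a dedicated runtime
        device_map="auto",
    )
    inputs = tokenizer(build_prompt(text, src_lang, tgt_lang), return_tensors="pt").to(model.device)
    out = model.generate(**inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, not the echoed prompt.
    return tokenizer.decode(out[0][inputs["input_ids"].shape[-1]:], skip_special_tokens=True)
```

Note that a 12B model in bfloat16 needs roughly 24 GB of accelerator memory; quantized loading or an inference server may be more practical.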

Intended Use Cases

This model is suitable for a variety of general language generation and understanding tasks, benefiting from its 12B parameter count and optimized training. Given its lineage from Sunbird/translategemma-12b-ug40, translation workloads are a natural fit in particular.
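The Unsloth + TRL recipe mentioned above can be sketched roughly as follows. This is not the authors' actual training script: the dataset name, record format, and hyperparameters are placeholders, and TRL's `SFTTrainer` arguments vary between versions, so treat it as an outline of the approach rather than a reproduction.

```python
def to_sft_example(source: str, target: str, src_lang: str, tgt_lang: str) -> dict:
    # Hypothetical SFT record layout: prompt followed by the reference
    # translation in a single "text" field, as SFTTrainer expects.
    return {"text": f"Translate from {src_lang} to {tgt_lang}:\n{source}\n{target}"}


def train() -> None:
    # Heavy imports kept local; requires a GPU environment with
    # unsloth, trl, and datasets installed.
    from unsloth import FastLanguageModel
    from trl import SFTTrainer, SFTConfig
    from datasets import load_dataset

    # Unsloth loads the base checkpoint with its fused/optimized kernels,
    # which is where the reported ~2x training speedup comes from.
    model, tokenizer = FastLanguageModel.from_pretrained(
        model_name="Sunbird/translategemma-12b-ug40",  # base checkpoint per the card
        max_seq_length=4096,      # placeholder; the card only documents a 32k serving context
        load_in_4bit=True,        # common Unsloth setting to fit 12B on one GPU
    )

    dataset = load_dataset("json", data_files="train.jsonl", split="train")  # placeholder path

    trainer = SFTTrainer(
        model=model,
        train_dataset=dataset,
        args=SFTConfig(
            output_dir="outputs",
            per_device_train_batch_size=2,   # placeholder hyperparameters
            gradient_accumulation_steps=8,
            num_train_epochs=1,
        ),
    )
    trainer.train()
    # "merged" in the model name suggests adapters were merged back into
    # the base weights before publishing.
    model.save_pretrained_merged("merged-model", tokenizer)
```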