alibayram/gemma3-27b-txt-comp
Capabilities: Vision
Concurrency Cost: 2
Model Size: 27B
Quantization: FP8
Context Length: 32k
Published: Feb 11, 2026
License: apache-2.0
Architecture: Transformer
Weights: Open (Cold)

alibayram/gemma3-27b-txt-comp is a 27-billion-parameter Gemma 3 model developed by alibayram, fine-tuned from alibayram/tr-gemma-128k-27b. According to the author, it was trained significantly faster using Unsloth together with Hugging Face's TRL library, making the fine-tuning process more efficient. It is intended for text-based applications that need a large language model produced with this training-efficiency-focused setup.
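The page does not include a usage snippet, so the sketch below shows one plausible way to run the model with the standard Hugging Face Transformers API. The model id comes from this page; the `generate_text` helper, prompt, and generation parameters are illustrative assumptions, and running it requires a GPU setup capable of hosting a 27B model.

```python
MODEL_ID = "alibayram/gemma3-27b-txt-comp"  # model id from this page


def generate_text(prompt: str, max_new_tokens: int = 128) -> str:
    """Hypothetical helper: load the model via Transformers and complete a prompt.

    Heavy imports are kept inside the function so the sketch can be read
    (and the module imported) without transformers/torch installed.
    """
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID,
        device_map="auto",   # spread the 27B weights across available devices
        torch_dtype="auto",  # use the checkpoint's native precision
    )
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    output_ids = model.generate(**inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, not the echoed prompt.
    new_tokens = output_ids[0][inputs["input_ids"].shape[-1]:]
    return tokenizer.decode(new_tokens, skip_special_tokens=True)


if __name__ == "__main__":
    print(generate_text("Summarize the Gemma 3 architecture in one sentence."))
```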
