AliMertTemizsoy/bilsem-gemma-3-12b-all-configs-sft-111
Modality: Vision · Concurrency Cost: 1 · Model Size: 12B · Quantization: FP8 · Context Length: 32k · Published: Mar 4, 2026 · License: apache-2.0 · Architecture: Transformer · Open Weights

AliMertTemizsoy/bilsem-gemma-3-12b-all-configs-sft-111 is a 12-billion-parameter Gemma 3 model fine-tuned by AliMertTemizsoy. It was trained with Unsloth and Hugging Face's TRL library, enabling faster fine-tuning. The model is intended for general language tasks, leveraging the Gemma 3 architecture and a 32,768-token context length.
