alibayram/gemma3-27b-multi-turn
Vision · Concurrency cost: 2 · Model size: 27B · Quantization: FP8 · Context length: 32k · Published: Feb 12, 2026 · Architecture: Transformer
The alibayram/gemma3-27b-multi-turn model is a 27-billion-parameter language model fine-tuned by alibayram using TRL. Built on the Gemma 3 architecture, it is specifically optimized for multi-turn conversational interactions: it is designed to generate coherent, contextually relevant responses across ongoing dialogues, making it suitable for advanced conversational AI applications.
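The card does not document an invocation API, but a minimal, hypothetical sketch can show what "multi-turn" means in practice: the full message history is resent to the model on every turn, which is the interaction pattern this fine-tune is optimized for. The `generate` function below is a placeholder standing in for a real inference call (e.g. via Ollama or Hugging Face transformers); all names here are illustrative, not part of the model's documentation.

```python
def generate(messages):
    # Placeholder for a real inference backend call; here it simply
    # echoes the most recent user turn so the sketch is runnable.
    last_user = messages[-1]["content"]
    return f"(model reply to: {last_user})"

def chat_turn(messages, user_input):
    """Append a user turn, query the model with the FULL history
    (this is what makes the exchange multi-turn), and record the reply."""
    messages.append({"role": "user", "content": user_input})
    reply = generate(messages)
    messages.append({"role": "assistant", "content": reply})
    return reply

if __name__ == "__main__":
    history = []
    chat_turn(history, "Hi, who are you?")
    chat_turn(history, "What did I just ask you?")
    # After two turns the history holds 4 messages (2 user, 2 assistant),
    # so the second request carried the first exchange as context.
    print(len(history))  # 4
```

Because the entire history is passed back each turn, the model can resolve references like "What did I just ask you?" against earlier messages; with a 32k context length, fairly long dialogues fit before truncation becomes necessary.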