annasoli/gemma3-27b-dpo-r64-layers30-35-2ep-merged
Modality: Vision · Concurrency Cost: 2 · Model Size: 27B · Quantization: FP8 · Context Length: 32k · Published: Jan 18, 2026 · Architecture: Transformer

annasoli/gemma3-27b-dpo-r64-layers30-35-2ep-merged is a 27-billion-parameter language model based on the Gemma 3 architecture. Judging by the name, it is a fine-tuned variant produced with Direct Preference Optimization (DPO), likely using a rank-64 adapter applied to layers 30–35, trained for 2 epochs, and then merged back into the base weights. Its large parameter count and 32K context window suggest it is suited to complex language-understanding and generation tasks.
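The 27B parameter count and FP8 quantization listed above give a rough lower bound on the memory needed just to hold the weights. A minimal back-of-the-envelope sketch (approximate figures only; real serving adds KV-cache and activation overhead on top):

```python
def estimate_weight_memory_gb(num_params: float, bytes_per_param: float) -> float:
    """Rough weight-memory footprint in gigabytes (1 GB = 1e9 bytes)."""
    return num_params * bytes_per_param / 1e9

# 27B parameters at FP8 (1 byte each) -> ~27 GB for the weights alone.
fp8_gb = estimate_weight_memory_gb(27e9, 1.0)

# The same weights at BF16 (2 bytes each) would need ~54 GB,
# which is why the FP8 quantization matters for single-GPU serving.
bf16_gb = estimate_weight_memory_gb(27e9, 2.0)

print(f"FP8: {fp8_gb:.0f} GB, BF16: {bf16_gb:.0f} GB")
```

Note that the 32k context length also contributes a per-request KV-cache cost not captured by this weight-only estimate.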
