annasoli/gemma3-27b-dpo-r64-layers20-25-2ep-merged
Capabilities: Vision
Concurrency Cost: 2
Model Size: 27B
Quantization: FP8
Context Length: 32k
Published: Jan 18, 2026
Architecture: Transformer

The annasoli/gemma3-27b-dpo-r64-layers20-25-2ep-merged model is a 27-billion-parameter language model based on the Gemma 3 architecture. The name indicates a fine-tuned variant: 'dpo' stands for Direct Preference Optimization, and 'r64-layers20-25-2ep-merged' suggests a rank-64 adapter applied to layers 20-25, trained for 2 epochs, and merged back into the base weights. With a context length of 32768 tokens, it is designed for general language understanding and generation, and its DPO fine-tuning is likely intended to improve alignment and response quality.
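For context, DPO fine-tunes a policy directly on preference pairs without training a separate reward model. The standard DPO objective is shown below; this is the general formulation from the DPO literature, not a detail confirmed for this specific checkpoint:

$$
\mathcal{L}_{\mathrm{DPO}}(\pi_\theta; \pi_{\mathrm{ref}}) = -\,\mathbb{E}_{(x,\, y_w,\, y_l) \sim \mathcal{D}} \left[ \log \sigma\!\left( \beta \log \frac{\pi_\theta(y_w \mid x)}{\pi_{\mathrm{ref}}(y_w \mid x)} - \beta \log \frac{\pi_\theta(y_l \mid x)}{\pi_{\mathrm{ref}}(y_l \mid x)} \right) \right]
$$

Here $x$ is a prompt, $y_w$ and $y_l$ are the preferred and dispreferred responses from dataset $\mathcal{D}$, $\pi_\theta$ is the policy being trained, $\pi_{\mathrm{ref}}$ is the frozen reference model (typically the pre-DPO checkpoint), $\sigma$ is the logistic sigmoid, and $\beta$ controls how far the policy may deviate from the reference.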
