vonjack/gemma2-2b-merged
Text generation

- Concurrency cost: 1
- Model size: 2.6B parameters
- Quantization: BF16
- Context length: 8k
- Published: Nov 19, 2024
- Architecture: Transformer

The vonjack/gemma2-2b-merged model is a 2.6 billion parameter language model based on the Google Gemma 2 architecture, created by merging pre-trained models using the TIES method. This model specifically integrates google/gemma-2-2b-it, making it an instruction-tuned variant of the Gemma 2-2B base. It is designed for general language understanding and generation tasks, leveraging its instruction-tuned nature for improved conversational and task-oriented performance.
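As a sketch of how such a TIES merge is typically expressed, the configuration below uses mergekit's YAML format. The model names are partly assumptions: the source only names google/gemma-2-2b-it as an ingredient, so the base model (google/gemma-2-2b) and the weight/density values here are illustrative, not the actual recipe used for this merge.

```yaml
# Hypothetical mergekit config for a TIES merge producing a Gemma 2-2B variant.
# google/gemma-2-2b-it is named in the model card; the base model and the
# weight/density parameters below are illustrative assumptions.
models:
  - model: google/gemma-2-2b-it
    parameters:
      weight: 1.0
      density: 0.5   # fraction of delta weights kept before sign consensus
merge_method: ties
base_model: google/gemma-2-2b
dtype: bfloat16      # matches the BF16 quantization listed above
```

A config like this would be run with `mergekit-yaml config.yml ./output-dir`; TIES trims low-magnitude parameter deltas and resolves sign conflicts before merging, which helps preserve the instruction-tuned behavior noted in the description.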
