rinna/gemma-2-baku-2b-it
Text Generation · Model Size: 2.6B · Quant: BF16 · Context Length: 8k · Published: Oct 2, 2024 · License: gemma · Architecture: Transformer · Concurrency Cost: 1

rinna/gemma-2-baku-2b-it is a 2.6 billion parameter instruction-tuned language model developed by rinna, based on the Gemma 2 architecture. It was fine-tuned using a chat vector derived from Google's Gemma 2 models and further refined with Odds Ratio Preference Optimization (ORPO) on rinna's internal dataset. This model is designed for instruction-following tasks, adhering to the Gemma 2 chat format, and utilizes the original google/gemma-2-2b-it tokenizer.
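Since the model adheres to the Gemma 2 chat format, prompts should wrap each turn in the Gemma turn delimiters. The authoritative template ships with the google/gemma-2-2b-it tokenizer (via `tokenizer.apply_chat_template`); the helper below is a minimal sketch that mirrors that turn structure, assuming the standard Gemma 2 delimiter tokens.

```python
def format_gemma2_chat(messages):
    """Render a list of {"role", "content"} dicts into a Gemma 2 prompt.

    Sketch only: for production use, prefer the chat template bundled
    with the google/gemma-2-2b-it tokenizer.
    """
    prompt = "<bos>"
    for msg in messages:
        # Gemma 2 names the assistant role "model".
        role = "model" if msg["role"] == "assistant" else msg["role"]
        prompt += f"<start_of_turn>{role}\n{msg['content']}<end_of_turn>\n"
    # Leave the final model turn open for generation.
    prompt += "<start_of_turn>model\n"
    return prompt

chat = [{"role": "user", "content": "自己紹介をしてください。"}]
print(format_gemma2_chat(chat))
```

Passing the formatted string to the model (or, equivalently, calling `tokenizer.apply_chat_template(chat, add_generation_prompt=True)`) yields a prompt ending in an open `<start_of_turn>model` turn, which the model completes.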
