rinna/gemma-2-baku-2b-it is a 2.6 billion parameter instruction-tuned language model developed by rinna, based on the Gemma 2 architecture. It was built by applying a chat vector derived from Google's Gemma 2 models and then further refined with Odds Ratio Preference Optimization (ORPO) on rinna's internal dataset. The model is designed for instruction-following tasks, adheres to the Gemma 2 chat format, and uses the original google/gemma-2-2b-it tokenizer.
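Since the model follows the Gemma 2 chat format and reuses the google/gemma-2-2b-it tokenizer, it can be prompted through the tokenizer's chat template. The snippet below is a minimal sketch using Hugging Face transformers, assuming the model is available on the Hugging Face Hub under the same identifier; the example message and generation parameters are illustrative only.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "rinna/gemma-2-baku-2b-it"

# The model reuses the original google/gemma-2-2b-it tokenizer,
# so loading the tokenizer from the model repo should just work.
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

# Gemma 2 chat format: role/content messages rendered by the
# tokenizer's built-in chat template.
messages = [
    {"role": "user", "content": "Introduce yourself briefly."},
]
input_ids = tokenizer.apply_chat_template(
    messages,
    add_generation_prompt=True,
    return_tensors="pt",
).to(model.device)

# Illustrative generation settings; tune for your use case.
output_ids = model.generate(
    input_ids,
    max_new_tokens=256,
    do_sample=True,
    temperature=0.7,
)
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```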