BAAI/Gemma2-9B-IT-Simpo-Infinity-Preference
Text Generation · Concurrency Cost: 1 · Model Size: 9B · Quant: FP8 · Ctx Length: 16k · Published: Aug 28, 2024 · Architecture: Transformer

BAAI/Gemma2-9B-IT-Simpo-Infinity-Preference is a 9-billion-parameter instruction-tuned causal language model from BAAI, built on Google's Gemma-2-9B-IT. It was fine-tuned on the Infinity-Preference dataset using the SimPO method, reaching a 73.4% length-controlled (LC) win rate on AlpacaEval 2.0 and a 58.1% win rate on Arena-Hard against GPT-4. The model is optimized for conversational quality and preference alignment, making it well suited to chat-based applications.
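Since the model targets chat-based use, prompts should follow Gemma-2's turn-based chat format. The sketch below shows how such a prompt might be assembled by hand; the exact template is an assumption here, and in practice you would verify it against (or use) the model tokenizer's `apply_chat_template` method:

```python
def format_gemma2_chat(messages):
    """Format a list of {"role", "content"} dicts into Gemma-2's assumed
    turn-based prompt format; verify against the model's own chat template."""
    parts = []
    for m in messages:
        # Gemma-2 uses "model" rather than "assistant" as the turn label.
        role = "model" if m["role"] == "assistant" else "user"
        parts.append(f"<start_of_turn>{role}\n{m['content']}<end_of_turn>\n")
    # Leave an open model turn so generation continues as the assistant.
    parts.append("<start_of_turn>model\n")
    return "".join(parts)

prompt = format_gemma2_chat([{"role": "user", "content": "Hello!"}])
print(prompt)
```

This produces a single string ready to tokenize and pass to the model for generation.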
