OpenLLM-Ro/RoGemma2-9b-Instruct-DPO-2024-10-09
Text generation · Model size: 9B · Quantization: FP8 · Context length: 16k · Published: Oct 10, 2024 · License: cc-by-nc-4.0 · Architecture: Transformer

OpenLLM-Ro/RoGemma2-9b-Instruct-DPO-2024-10-09 is a 9-billion-parameter instruction-tuned generative text model developed by OpenLLM-Ro and aligned specifically for the Romanian language. Part of the RoGemma2 family, it was fine-tuned from the RoGemma2-9b-Instruct-2024-10-09 base model using Direct Preference Optimization (DPO). The model targets Romanian-specific natural language tasks, showing strong results on academic benchmarks such as LaRoSeDa and XQuAD, and is intended for assistant-like chat applications in Romanian.
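For chat use, prompts must follow the model's chat template. The sketch below assumes RoGemma2 inherits the standard Gemma instruct format (`<start_of_turn>`/`<end_of_turn>` markers, with `model` as the assistant role); this is an assumption to verify against the `tokenizer_config.json` in the model repository.

```python
# Minimal sketch of Gemma-style chat prompt formatting.
# Assumption: RoGemma2 uses the standard Gemma chat template; in practice,
# prefer tokenizer.apply_chat_template(), which reads the template from
# the model repo itself.

def build_gemma_prompt(messages):
    """Render a list of {"role", "content"} dicts as a Gemma-style prompt."""
    parts = []
    for msg in messages:
        # Gemma names the assistant role "model".
        role = "model" if msg["role"] == "assistant" else msg["role"]
        parts.append(f"<start_of_turn>{role}\n{msg['content']}<end_of_turn>\n")
    parts.append("<start_of_turn>model\n")  # cue the model to answer
    return "".join(parts)

prompt = build_gemma_prompt([
    {"role": "user", "content": "Care este capitala României?"}
])
```

With the `transformers` library, the equivalent is `tokenizer.apply_chat_template(messages, add_generation_prompt=True)`, which avoids hard-coding the template at all.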
