gcelikmasat-work/gemma-2-9b-it-BPMN
Gemma2-9B-BPMG-IT is a 9-billion-parameter instruction-tuned language model, a LoRA adaptation of Google's Gemma-2-9b-it developed by Gökberk Çelikmasat and collaborators. It converts natural language business process descriptions into BPMN models rendered in Graphviz DOT format, and is optimized for generating first-draft BPMN process fragments (tasks, events, sequence flows, and AND/XOR gateways) to assist business process modelers.
Overview
Gemma2-9B-BPMG-IT is a 9-billion-parameter instruction-tuned language model derived from google/gemma-2-9b-it via LoRA adaptation. Developed by Gökberk Çelikmasat, Atay Özgövde, and Fatma Başak Aydemir, it is designed to translate natural language descriptions of business processes into BPMN models, emitting them in Graphviz DOT format. It was trained on a cleaned subset of the MaD dataset and is described in a paper presented at PROFES 2025 and in a journal article under review at Software and Systems Modeling.
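A minimal inference sketch with Hugging Face `transformers` and `peft` might look like the following. The prompt wording and generation settings are illustrative assumptions, not the exact setup used in the papers; prefer the tokenizer's built-in chat template where available.

```python
# Hypothetical inference sketch: load the base model, attach the LoRA adapter,
# and request a DOT-format BPMN model. Prompt text and generation parameters
# are assumptions, not the authors' documented configuration.

def build_prompt(description: str) -> str:
    # Gemma-2 turn format (assumed; tokenizer.apply_chat_template is safer).
    return (
        "<start_of_turn>user\n"
        "Convert this business process description into a BPMN model "
        f"in Graphviz DOT format:\n{description}<end_of_turn>\n"
        "<start_of_turn>model\n"
    )

def generate_bpmn_dot(description: str) -> str:
    # Heavy dependencies imported lazily; this downloads ~9B base weights.
    import torch
    from peft import PeftModel
    from transformers import AutoModelForCausalLM, AutoTokenizer

    base_id = "google/gemma-2-9b-it"
    adapter_id = "gcelikmasat-work/gemma-2-9b-it-BPMN"

    tokenizer = AutoTokenizer.from_pretrained(base_id)
    model = AutoModelForCausalLM.from_pretrained(
        base_id, torch_dtype=torch.bfloat16, device_map="auto"
    )
    model = PeftModel.from_pretrained(model, adapter_id)

    inputs = tokenizer(build_prompt(description), return_tensors="pt").to(model.device)
    output = model.generate(**inputs, max_new_tokens=1024, do_sample=False)
    # Decode only the newly generated tokens, not the echoed prompt.
    return tokenizer.decode(
        output[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True
    )
```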
Key Capabilities
- BPMN Model Generation: Converts textual business process descriptions into BPMN models expressed in Graphviz DOT notation.
- Focused Scope: Supports a specific subset of BPMN, including start/end events, tasks, sequence flows, and AND/XOR gateways.
- Performance: Achieves strong BPMN-generation benchmark scores, including an R-GED accuracy of 97.78% on the 180-instance benchmark from the InstruBPM journal paper.
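As an illustration of the supported subset, a first-draft output for a simple approval process could look like the DOT fragment below. This is a hypothetical example of the target format, not actual model output; node shapes and labels are assumptions.

```dot
digraph bpmn {
  rankdir=LR;
  start   [shape=circle, label="Start"];
  review  [shape=box, label="Review request"];
  xor1    [shape=diamond, label="XOR"];
  approve [shape=box, label="Approve request"];
  reject  [shape=box, label="Reject request"];
  xor2    [shape=diamond, label="XOR"];
  end     [shape=doublecircle, label="End"];
  start -> review;
  review -> xor1;
  xor1 -> approve [label="approved"];
  xor1 -> reject  [label="rejected"];
  approve -> xor2;
  reject -> xor2;
  xor2 -> end;
}
```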
Intended Use Cases
- Accelerated Modeling: Generate initial BPMN drafts from text to speed up early-stage business process modeling.
- Assistant for Modelers: Serves as a tool for business process modelers and analysts, though human review is recommended for complex logic.
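Because outputs are first drafts, a lightweight sanity check before handing a response to a human reviewer can catch obviously malformed DOT. A hypothetical post-processing helper (function names are illustrative; the authors' pipeline may differ):

```python
import re
from typing import Optional

def extract_dot(response: str) -> Optional[str]:
    """Pull the first `digraph ... { ... }` block out of a model response.

    Hypothetical helper: finds the opening brace, then walks forward
    tracking nesting depth so nested subgraph braces are handled.
    Returns None if no digraph is found or braces are unbalanced.
    """
    match = re.search(r"digraph\b[^{]*\{", response)
    if match is None:
        return None
    depth = 0
    for i in range(match.end() - 1, len(response)):
        if response[i] == "{":
            depth += 1
        elif response[i] == "}":
            depth -= 1
            if depth == 0:
                return response[match.start():i + 1]
    return None  # unbalanced braces: treat as malformed

def looks_like_valid_dot(response: str) -> bool:
    # Minimal structural check only; it does not validate BPMN semantics.
    return extract_dot(response) is not None
```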
Limitations
- BPMN Subset: Does not generate pools, lanes, message flows, data objects, intermediate/boundary events, sub-processes, or annotations.
- Language: Primarily supports English input.
- Parameter Efficiency: At 9B parameters, it is larger and requires more compute than its successor, gcelikmasat-work/Qwen3_4B_BPMN_IT, which offers comparable accuracy at roughly half the size.