chenyitian-shanshu/SIRL-Gurobi32B
Text generation · Concurrency cost: 2 · Model size: 32.8B · Quantization: FP8 · Context length: 32k · Published: Sep 9, 2025 · License: MIT · Architecture: Transformer · Open weights · Cold

chenyitian-shanshu/SIRL-Gurobi32B is a 32.8-billion-parameter language model developed by chenyitian-shanshu, based on the Qwen2.5 architecture. It is trained with Solver-Informed Reinforcement Learning (SIRL) to generate accurate mathematical formulations and code for optimization modeling. The model integrates with the Gurobi optimization solver, excels at translating natural-language problem descriptions into solvable optimization programs, and surpasses other models on various optimization benchmarks.
