aiseosae/Affine-G4-5EHNj2HZoRYKXtewrXPbvCTixTPdPGQJ6SkaZvrx3GeqEhsc
Text Generation · Concurrency Cost: 4 · Model Size: 72.7B · Quant: FP8 · Ctx Length: 32k · Published: Jan 31, 2026 · License: MIT · Architecture: Transformer · Open Weights · Cold

Kimi-Dev-72B is a 72.7 billion parameter open-source coding LLM developed by the Kimi-Dev Team and optimized for software engineering tasks and issue resolution. It achieves state-of-the-art performance among open-source models on SWE-bench Verified, resolving 60.4% of issues. The model is fine-tuned with large-scale reinforcement learning: it autonomously patches real repositories inside Docker containers and receives a reward only when the entire test suite passes, which encourages robust, correct solutions rather than partial fixes.
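The reward scheme described above can be sketched as a binary signal gated on the full test suite. The image name, mount path, and `pytest` command below are illustrative assumptions, not the Kimi-Dev Team's actual pipeline; the `runner` parameter is a hypothetical injection point so the Docker call can be stubbed out:

```python
import subprocess

def suite_reward(image: str, repo_dir: str, runner=subprocess.run) -> float:
    """Binary RL reward: 1.0 only if the *entire* test suite passes in Docker.

    Hedged sketch of the reward described in the model card. The container
    image, workspace mount, and test command are assumptions for illustration.
    """
    result = runner(
        [
            "docker", "run", "--rm",
            "-v", f"{repo_dir}:/workspace",
            "-w", "/workspace",
            image,
            "pytest", "-q",
        ],
        capture_output=True,
    )
    # Exit code 0 means every test passed; no partial credit is given,
    # which pushes the model toward complete, correct patches.
    return 1.0 if result.returncode == 0 else 0.0
```

Because the reward is all-or-nothing, a patch that fixes the target issue but breaks an unrelated test earns nothing, which is what drives the "entire test suites" framing in the description.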
