raafatabualazm/decompiler-v2
Text generation · Concurrency cost: 1 · Model size: 4B · Quant: BF16 · Context length: 32k · Published: Oct 16, 2025 · Architecture: Transformer

raafatabualazm/decompiler-v2 is a 4-billion-parameter model based on Qwen/Qwen3-4B-Thinking-2507, fine-tuned by raafatabualazm. The model specializes in idiomatic decompilation, translating assembly code into high-level languages such as Dart and Swift. It was trained with LoRA/DoRA adapters on custom assembly-to-Dart/Swift pairs, targeting reverse-engineering and code-analysis tasks.
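Since the base model is a Qwen3 chat model on Hugging Face, usage would presumably follow the standard `transformers` chat interface. The sketch below is illustrative only: the prompt wording and the `build_prompt`/`decompile` helpers are assumptions, not a format documented by the model card.

```python
# Hypothetical usage sketch, assuming decompiler-v2 follows the standard
# Qwen3-style chat interface via Hugging Face transformers. Prompt wording
# and helper names are illustrative, not documented by the model card.

MODEL_ID = "raafatabualazm/decompiler-v2"

def build_prompt(asm: str, target: str = "Swift") -> list[dict]:
    """Wrap an assembly listing in a single-turn chat request."""
    return [{
        "role": "user",
        "content": f"Decompile the following assembly into idiomatic {target}:\n\n{asm}",
    }]

def decompile(asm: str, target: str = "Swift", max_new_tokens: int = 1024) -> str:
    # Imported here so build_prompt stays usable without transformers installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    # BF16 matches the quantization listed on the card.
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, torch_dtype="bfloat16")
    inputs = tokenizer.apply_chat_template(
        build_prompt(asm, target), add_generation_prompt=True, return_tensors="pt"
    )
    out = model.generate(inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens (the model's reply).
    return tokenizer.decode(out[0][inputs.shape[-1]:], skip_special_tokens=True)
```

For example, `decompile("mov x0, #0\nret", target="Swift")` would ask the model for a Swift rendering of a trivial AArch64 function; since the base model is a "Thinking" variant, the reply may include reasoning before the final code.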
