Shuibai12138/CDLM-0.5B
Text Generation · Model size: 0.5B · Quant: BF16 · Context length: 32k · Published: Dec 18, 2025 · License: MIT · Architecture: Transformer · Open weights

CDLM-0.5B by Shuibai12138 is a 0.5-billion-parameter Masked Diffusion Language Model (MDLM) fine-tuned from fredzzp/open-dcoder-0.5B, which is in turn based on Qwen2. It is trained with an error-aware mixture objective that explicitly supervises incorrect tokens, giving the model calibrated confidence about which tokens are likely wrong and enabling targeted refinement. This makes it well suited to iterative code correction: rather than regenerating whole sequences, it can identify suspect tokens and rewrite them over successive refinement steps.
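To illustrate the iterative-refinement idea behind masked diffusion decoding, here is a minimal sketch of a confidence-driven remasking loop. All names (`logits_fn`, `mask_id`, `keep_frac`) are hypothetical placeholders, not the model's actual API; a real implementation would call the model for `logits_fn`.

```python
import numpy as np

def refine(logits_fn, tokens, mask_id, steps=4, keep_frac=0.5):
    """Illustrative remasking loop (hypothetical, not CDLM's actual API):
    fill masked positions with the model's predictions, then re-mask the
    least-confident fills and predict them again on the next step."""
    tokens = np.array(tokens)
    for _ in range(steps):
        masked = tokens == mask_id
        if not masked.any():
            break                                # nothing left to refine
        probs = logits_fn(tokens)                # (seq_len, vocab) probabilities
        pred = probs.argmax(axis=-1)             # most likely token per position
        conf = probs.max(axis=-1)                # its probability = confidence
        tokens = np.where(masked, pred, tokens)  # fill masked positions
        # re-mask the least-confident fraction of the freshly filled tokens
        idx = np.where(masked)[0]
        order = idx[np.argsort(conf[idx])]
        n_remask = int(len(idx) * (1 - keep_frac))
        tokens[order[:n_remask]] = mask_id
    return tokens
```

Error-aware training fits naturally into this loop: a model supervised on incorrect tokens should assign low confidence exactly where its fills are wrong, so the remasking step concentrates compute on the tokens most in need of correction.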
