joeyzero/Qwen3-4B-Reasoning-Backfill-v0.1
Text generation · Concurrency cost: 1 · Model size: 4B · Quant: BF16 · Ctx length: 32k · License: apache-2.0 · Architecture: Transformer · Open weights

joeyzero/Qwen3-4B-Reasoning-Backfill-v0.1 is an experimental 4-billion-parameter model, fine-tuned from Qwen/Qwen3-4B, designed to reconstruct plausible reasoning chains. It specializes in generating stepwise "thinking" traces that connect a user-provided instruction to a fixed solution, without altering the solution itself. The model is intended primarily for reasoning backfill: filling in explicit thought processes for datasets that lack them, which enables bootstrapping of process-supervision signals and improves the auditability of model outputs. It supports a 40,960-token context length.
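To illustrate the intended reasoning-backfill workflow, here is a minimal sketch of how a prompt might be assembled for this model. The prompt wording and the helper `build_backfill_messages` are illustrative assumptions, not a documented prompt format for this checkpoint:

```python
def build_backfill_messages(instruction: str, solution: str) -> list[dict]:
    """Ask the model to reconstruct a thinking trace that leads from the
    instruction to the fixed solution, without changing the solution.
    The prompt text below is a hypothetical template, not the model's
    documented format."""
    return [
        {
            "role": "user",
            "content": (
                "Reconstruct the step-by-step reasoning that connects the "
                "instruction below to the given solution. Do not modify the "
                "solution.\n\n"
                f"Instruction:\n{instruction}\n\n"
                f"Solution:\n{solution}"
            ),
        }
    ]


messages = build_backfill_messages("Compute 12 * 7.", "84")

# With Hugging Face transformers, generation would then look roughly like
# the following (not executed here; requires downloading the weights):
#
#   from transformers import AutoModelForCausalLM, AutoTokenizer
#   repo = "joeyzero/Qwen3-4B-Reasoning-Backfill-v0.1"
#   tok = AutoTokenizer.from_pretrained(repo)
#   model = AutoModelForCausalLM.from_pretrained(repo, torch_dtype="bfloat16")
#   inputs = tok.apply_chat_template(
#       messages, add_generation_prompt=True, return_tensors="pt")
#   out = model.generate(inputs, max_new_tokens=1024)
#   print(tok.decode(out[0], skip_special_tokens=True))
```

The returned "thinking" trace would then be paired with the original instruction and solution to form a process-supervised training example; the solution field itself is carried over unchanged.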
