agentica-org/DeepCoder-1.5B-Preview
Text generation · Model size: 1.5B · Quantization: BF16 · Context length: 32k · Published: Apr 7, 2025 · License: MIT · Architecture: Transformer · Open weights

DeepCoder-1.5B-Preview is a 1.5-billion-parameter code reasoning LLM from agentica-org, fine-tuned from DeepSeek-R1-Distill-Qwen-1.5B. It is trained with distributed reinforcement learning using an improved GRPO+ algorithm and iterative context lengthening, achieving strong performance on coding benchmarks. The model excels at code generation and problem solving, and supports a maximum context length of 131,072 tokens for complex programming tasks.
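A minimal usage sketch with the Hugging Face `transformers` library is shown below. The sampling parameters and the prompt-building helper are illustrative assumptions, not official recommendations from agentica-org:

```python
# Hypothetical usage sketch for agentica-org/DeepCoder-1.5B-Preview.
# Assumes the Hugging Face transformers and torch packages are installed;
# generation parameters below are illustrative, not official defaults.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "agentica-org/DeepCoder-1.5B-Preview"

def build_messages(task: str) -> list[dict]:
    # Single-turn chat prompt: reasoning models of this family
    # are typically prompted with a plain user message.
    return [{"role": "user", "content": task}]

if __name__ == "__main__":
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID,
        torch_dtype=torch.bfloat16,  # matches the BF16 quantization listed above
        device_map="auto",
    )
    messages = build_messages("Write a Python function that reverses a linked list.")
    inputs = tokenizer.apply_chat_template(
        messages, add_generation_prompt=True, return_tensors="pt"
    ).to(model.device)
    output = model.generate(inputs, max_new_tokens=2048, do_sample=True, temperature=0.6)
    # Decode only the newly generated tokens, skipping the prompt.
    print(tokenizer.decode(output[0][inputs.shape[-1]:], skip_special_tokens=True))
```

Because this is a reasoning model, generated answers may include an extended chain of thought before the final code, so a generous `max_new_tokens` budget is advisable.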
