janhq/Jan-code-4b

Text generation · Model size: 4B · Quant: BF16 · Context length: 32k · Published: Mar 2, 2026 · License: apache-2.0 · Architecture: Transformer

Jan-code-4b by janhq is a 4 billion parameter code-tuned language model built on Jan-v3-4B-base-instruct. It is specifically designed for local execution and quick iteration on everyday coding tasks. This model excels at handling well-scoped coding subtasks reliably, making it suitable as a lightweight coding assistant for generation, editing, refactoring, and debugging, or as a fast worker model in agentic workflows.


Jan-Code-4B: A Small, Code-Tuned Model

Jan-Code-4B, developed by janhq, is a 4 billion parameter model specifically tuned for coding tasks. Built upon the Jan-v3-4B-base-instruct architecture, it prioritizes local execution and rapid iteration, making it a practical choice for developers.

Key Capabilities

  • Efficient Code Handling: Designed to reliably manage well-scoped coding subtasks.
  • Low Latency & Compute: Optimized for minimal latency and reduced computational requirements, ideal for local deployment.
  • Agentic Workflow Integration: Functions effectively as a lightweight "worker" model within larger agent setups.

Good For

  • Lightweight Coding Assistant: Excellent for code generation, editing, refactoring, and debugging.
  • Sub-agent in Agentic Systems: Can be used to produce patches or tests, complementing a larger planning model.
  • Replacing Haiku in Claude Code Setups: Can stand in as the small, fast model in Claude Code configurations that route lightweight tasks to an inexpensive model.
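As a sketch of the sub-agent pattern above, the snippet below builds a chat-completion request that a planning model could hand to Jan-Code-4B to produce a patch. The payload shape follows the OpenAI-compatible API that local servers such as vLLM and llama.cpp expose; the served model id and the prompt wording are assumptions, not part of the model card.

```python
# Sketch: constructing an OpenAI-compatible chat-completion payload that
# asks Jan-Code-4B, acting as a coding sub-agent, to emit a unified diff.
# The model id "janhq/Jan-code-4b" is an assumption about how the model
# is registered on the local server.

def build_patch_request(instruction: str, filename: str, file_content: str) -> dict:
    """Build a request payload a larger planning model could delegate."""
    system = (
        "You are a coding sub-agent. Reply only with a unified diff "
        "that applies cleanly to the given file."
    )
    user = f"Task: {instruction}\n\nFile: {filename}\n```\n{file_content}\n```"
    return {
        "model": "janhq/Jan-code-4b",  # assumed served model id
        "messages": [
            {"role": "system", "content": system},
            {"role": "user", "content": user},
        ],
        "temperature": 0.2,  # low temperature favors deterministic patches
        "max_tokens": 1024,
    }

payload = build_patch_request(
    "Rename function foo to bar", "util.py", "def foo():\n    return 1\n"
)
```

The same payload can then be POSTed to the local server's `/v1/chat/completions` endpoint by whichever agent framework orchestrates the workers.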

Jan-Code-4B is optimized for direct integration with the Jan Desktop App and supports local deployment via vLLM and llama.cpp.
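For local deployment, the commands below show how such a model is typically served with vLLM and with llama.cpp's `llama-server`. The exact Hugging Face repo and GGUF file names are assumptions; check the model page for the quantized weights actually published.

```shell
# Serve with vLLM (exposes an OpenAI-compatible API on port 8000 by default)
vllm serve janhq/Jan-code-4b --max-model-len 32768

# Serve with llama.cpp's llama-server, pulling a GGUF directly from the Hub.
# The "-GGUF" repo name is an assumption about the quantized release.
llama-server -hf janhq/Jan-code-4b-GGUF --port 8080
```

Either server can then back the Jan Desktop App or any client that speaks the OpenAI chat-completions protocol.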