Jan-v1-2509: Advanced Agentic Language Model
Jan-v1-2509 is an updated release in the Jan Family of models by janhq, designed for agentic reasoning and problem-solving. This 4-billion-parameter model is built on the Qwen3-4B-Thinking architecture and continues the earlier Lucy line, with enhanced reasoning and tool-use capabilities. While it shows a slight decrease on SimpleQA compared to the original Jan-v1, this update delivers improved results on other chat benchmarks and offers greater overall reliability.
Key Capabilities
- Agentic Reasoning: Optimized for complex problem-solving and tool utilization within agentic workflows.
- Question Answering: The Jan-v1 line reaches around 91.1% accuracy on the SimpleQA benchmark; this release trades a slight dip there for stronger chat performance while remaining a capable factual question answerer for its scale.
- Conversational Performance: Shows improved performance across various chat benchmarks, indicating robust conversational and instructional capabilities.
- Integration: Designed for direct use with the Jan App, where it runs out of the box.
When to Use This Model
Jan-v1-2509 is particularly well-suited for applications requiring:
- Agent-based systems: Its design for agentic reasoning makes it ideal for tasks involving planning, tool use, and multi-step problem-solving.
- Factual Question Answering: Strong performance on SimpleQA suggests its utility in applications needing accurate information retrieval.
- Conversational AI: Improved chat benchmark results make it a reliable choice for chatbots and instructional interfaces.
- Local Deployment: Supports local deployment via vLLM and llama.cpp, offering flexibility for various environments.
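As a minimal sketch of the local-deployment options above, either backend can expose the model behind an OpenAI-compatible HTTP endpoint. The Hugging Face repository ids below (`janhq/Jan-v1-2509` and a corresponding `-GGUF` build) are assumptions inferred from the model name; check the janhq organization on Hugging Face for the exact ids.

```shell
# Option 1: vLLM (repo id is an assumption).
# Serves an OpenAI-compatible API at http://localhost:8000/v1
vllm serve janhq/Jan-v1-2509 --port 8000

# Option 2: llama.cpp's llama-server with a quantized GGUF build
# (repo id is an assumption). Also exposes an OpenAI-compatible
# /v1/chat/completions endpoint, here on port 8080.
llama-server -hf janhq/Jan-v1-2509-GGUF --port 8080
```

With either server running, any OpenAI-compatible client can be pointed at the local base URL, which is what makes the model easy to slot into existing agent frameworks.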