zenlm/zen-eco-4b-thinking

Hosted on Hugging Face

  • Task: Text generation
  • Model size: 4B parameters
  • Quantization: BF16
  • Context length: 32K tokens
  • Published: Sep 28, 2025
  • License: apache-2.0
  • Architecture: Transformer (open weights)
  • Concurrency cost: 1

The zenlm/zen-eco-4b-thinking model, developed by Hanzo AI and the Zoo Labs Foundation, is an efficient 4-billion-parameter language model built on the Zen MoDE (Mixture of Distilled Experts) architecture. It offers a 32K-token context window and is specifically optimized for chain-of-thought reasoning, making it well suited to applications that require robust logical processing within a compact parameter footprint.

Zen Eco 4b Thinking: Efficient Reasoning with Extended Context

Zen Eco 4b Thinking is a 4 billion parameter language model developed by Hanzo AI and the Zoo Labs Foundation. It is built upon the Zen MoDE (Mixture of Distilled Experts) architecture, designed for efficient processing and enhanced reasoning capabilities.
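A minimal loading and generation sketch follows. It assumes the repository exposes the standard Hugging Face transformers causal-LM interface (AutoTokenizer / AutoModelForCausalLM), which this card does not itself confirm; the BF16 dtype matches the published weights.

```python
# Minimal sketch: load zen-eco-4b-thinking and run a single reasoning prompt.
# Assumes the repo follows the standard causal-LM interface (not confirmed by this card).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "zenlm/zen-eco-4b-thinking"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the published BF16 weights
    device_map="auto",           # place the 4B model on available GPU(s) or CPU
)

prompt = "A train leaves at 09:00 and covers 120 km at 80 km/h. When does it arrive?"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=512)

# Print only the newly generated tokens, not the echoed prompt.
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[-1]:], skip_special_tokens=True))
```

At 4B parameters in BF16, the weights occupy roughly 8 GB, so the model should fit on a single consumer GPU.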

Key Capabilities

  • Extended Chain-of-Thought Reasoning: The model is optimized to produce and sustain extended chain-of-thought reasoning, making it suitable for tasks that require multi-step logical deduction (see the sketch after this list).
  • Large Context Window: With a substantial 32K token context window, it can process and understand longer inputs and maintain coherence over extended conversations or documents.
  • Efficient Architecture: Utilizing the Zen MoDE architecture, this 4B parameter model aims to deliver strong performance while maintaining computational efficiency.
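For multi-step reasoning prompts, a chat-style call is sketched below. It assumes the tokenizer ships a chat template (common for instruction- and thinking-tuned checkpoints, but an assumption here) and continues from the loading example above.

```python
# Sketch: multi-step reasoning via the chat template (assumed to exist),
# reusing `tokenizer` and `model` from the loading example above.
messages = [
    {"role": "user",
     "content": "Convert 2.5 hours to seconds. Show each step, then state the final result."},
]
input_ids = tokenizer.apply_chat_template(
    messages,
    add_generation_prompt=True,
    return_tensors="pt",
).to(model.device)

# Leave headroom for intermediate reasoning before the final answer;
# the 32K context window allows long prompts plus long generations.
outputs = model.generate(input_ids, max_new_tokens=1024, do_sample=True, temperature=0.6)
print(tokenizer.decode(outputs[0][input_ids.shape[-1]:], skip_special_tokens=True))
```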

Good For

  • Applications requiring robust logical processing and multi-step reasoning.
  • Scenarios where a compact model size (4B parameters) is preferred without significantly compromising on reasoning depth.
  • Tasks benefiting from a large context window to handle extensive textual information.
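As one illustration of the long-context use case, the sketch below feeds a long local document into the model and guards against exceeding the 32K-token window. The file name report.txt is a hypothetical placeholder, and the snippet continues from the loading example above.

```python
# Sketch: summarizing a long document within the 32K-token context window,
# reusing `tokenizer` and `model` from the loading example above.
# `report.txt` is a hypothetical placeholder for any long local text file.
with open("report.txt", encoding="utf-8") as f:
    long_document = f.read()

prompt = f"Summarize the key findings of the following report:\n\n{long_document}"

# Truncate defensively so the prompt plus the generation stay inside the 32K window.
inputs = tokenizer(prompt, return_tensors="pt", truncation=True, max_length=30_000).to(model.device)
print(f"Prompt length: {inputs['input_ids'].shape[-1]} tokens (context limit: 32K)")

outputs = model.generate(**inputs, max_new_tokens=1024)
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[-1]:], skip_special_tokens=True))
```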