zenlm/zen-eco-4b-thinking
Task: Text Generation
Concurrency Cost: 1 | Model Size: 4B | Quant: BF16 | Ctx Length: 32k
Published: Sep 28, 2025 | License: apache-2.0 | Architecture: Transformer
Open Weights | Warm

The zenlm/zen-eco-4b-thinking model, developed by Hanzo AI and the Zoo Labs Foundation, is an efficient 4-billion-parameter language model built on the Zen MoDE (Mixture of Distilled Experts) architecture. It features an extended 32K-token context window and is specifically optimized for enhanced chain-of-thought reasoning. The model targets applications that require robust logical processing within a compact parameter footprint.
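As a rough usage sketch, the model can presumably be loaded through the standard Hugging Face transformers chat API like other open-weight causal language models. The model id comes from this card; the helper names (`build_messages`, `generate_answer`) and the exact chat-template behavior are assumptions, not documented API of this model.

```python
MODEL_ID = "zenlm/zen-eco-4b-thinking"  # model id from this card

def build_messages(question: str) -> list[dict]:
    # Chat-message format consumed by tokenizer.apply_chat_template.
    return [{"role": "user", "content": question}]

def generate_answer(question: str, max_new_tokens: int = 512) -> str:
    # Imported lazily so the helper above works without transformers installed.
    # NOTE: first call downloads the BF16 weights (several GB for a 4B model).
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, torch_dtype="bfloat16")
    prompt = tokenizer.apply_chat_template(
        build_messages(question), tokenize=False, add_generation_prompt=True
    )
    inputs = tokenizer(prompt, return_tensors="pt")
    out = model.generate(**inputs, max_new_tokens=max_new_tokens)
    # Keep only the newly generated tokens, dropping the echoed prompt.
    new_tokens = out[0][inputs["input_ids"].shape[-1]:]
    return tokenizer.decode(new_tokens, skip_special_tokens=True)
```

A thinking-tuned model typically emits its chain-of-thought before the final answer, so a generous `max_new_tokens` budget is advisable for multi-step reasoning prompts.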
