SamuelBang/AesCoder-4B
Text Generation · Model size: 4B · Quant: BF16 · Context length: 32k · Concurrency cost: 1 · Published: Oct 25, 2025 · License: apache-2.0 · Architecture: Transformer · Open Weights

AesCoder-4B is a 4-billion-parameter model developed by SamuelBang (Microsoft Research Asia, Shanghai Jiao Tong University, Peking University), designed specifically to improve the aesthetic quality of LLM-generated code, particularly for webpage design. It was trained on the AesCode-358K dataset with an agentic reward feedback system that integrates executability checks with static and interactive aesthetic feedback. The model generates visually appealing, functional web code and outperforms larger models such as GPT-4o and GPT-4.1 on code aesthetics benchmarks.
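As a rough illustration only (not the authors' implementation), the agentic reward described above might fold the three feedback signals into a single scalar. The function name, gating behavior, and weights below are all hypothetical assumptions:

```python
def aesthetic_reward(executability: float,
                     static_score: float,
                     interactive_score: float,
                     weights: tuple = (0.4, 0.3, 0.3)) -> float:
    """Combine three feedback signals (each in [0, 1]) into one reward.

    Hypothetical sketch: a page that does not execute gets zero reward
    regardless of how it looks, so executability gates the aesthetic
    terms. The weights are illustrative, not from the paper.
    """
    if executability == 0.0:
        return 0.0
    w_exec, w_static, w_inter = weights
    return (w_exec * executability
            + w_static * static_score
            + w_inter * interactive_score)

# Fully executable page, strong static but weaker interactive aesthetics:
reward = aesthetic_reward(1.0, 0.9, 0.5)  # weighted sum ≈ 0.82
```

In a setup like this, the gating choice matters: weighting alone would still reward a beautiful page that fails to run, while gating enforces executability as a hard prerequisite.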
