AesCoder-4B is a 4-billion-parameter model developed by SamuelBang (Microsoft Research Asia, Shanghai Jiao Tong University, Peking University), designed to improve the aesthetic quality of LLM-generated code, particularly for webpage design. It was trained on the AesCode-358K dataset using an agentic reward-feedback system that combines executability with static and interactive aesthetics signals. The model generates visually appealing, functional web code and is reported to outperform larger models such as GPT-4o and GPT-4.1 on code-aesthetics benchmarks.