AesCoder-4B is a 4-billion-parameter model developed by Microsoft Research Asia, Shanghai Jiao Tong University, and Peking University, designed specifically to improve the aesthetic quality of LLM-generated code. It was fine-tuned on the AesCode-358K dataset and uses agentic reward feedback to jointly optimize functionality and code aesthetics. The model excels at visually oriented coding tasks, particularly webpage design, and has shown performance comparable to much larger models on code aesthetics benchmarks.
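Since the model targets webpage-design tasks, a typical use is prompting it for a complete page and then rendering the output to judge its visual quality. The following is a minimal sketch of such a call via the Hugging Face transformers library; the repository ID `AesCoder-4B` is an assumed placeholder (check the model release for the actual hub path), and the use of a chat template is likewise an assumption about how the release is packaged.

```python
# Hedged sketch: prompting an AesCoder-style model for a webpage-design task.
# The model ID below is a placeholder, not a verified hub path.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "AesCoder-4B"  # assumption -- substitute the actual repository ID
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

messages = [
    {"role": "user",
     "content": "Write a single-file HTML landing page with a clean, "
                "modern hero section and a responsive layout."}
]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

# Generate the page; aesthetics-focused models are judged on how the
# rendered result looks, not only on functional correctness.
outputs = model.generate(inputs, max_new_tokens=1024)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```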