Machine Learning for Real-Time Constrained Optimization: The Case of Optimal Power Flow
Abstract
Optimization problems subject to hard constraints are common in time-critical applications such as autonomous driving and high-frequency trading. However, existing iterative solvers often struggle to solve these problems in real time. In this talk, we advocate a machine learning approach: employ a neural network's (NN's) approximation capability to learn the input-solution mapping of a problem, then pass a new input through the NN to obtain a high-quality solution orders of magnitude faster than iterative solvers. To date, the approach has achieved promising empirical performance and exciting theoretical development for an essential optimal power flow (OPF) problem in grid operation. A fundamental issue, however, is ensuring NN solution feasibility with respect to the hard constraints, which is non-trivial due to inherent NN prediction errors. To this end, we present two approaches, predict-and-reconstruct and homeomorphic projection, to ensure that the NN solution strictly satisfies the equality and inequality constraints, respectively. In particular, homeomorphic projection is a low-complexity scheme that guarantees NN solution feasibility for optimization over any set homeomorphic to a unit ball, covering all compact convex sets and certain classes of nonconvex sets. The idea is to (i) learn a minimum-distortion homeomorphic mapping between the constraint set and a unit ball using an invertible NN (INN), and then (ii) perform a simple bisection operation within the unit ball so that the INN-mapped final solution is feasible with respect to the constraint set, incurring only a minor distortion-induced optimality loss. We prove the feasibility guarantee and bound the optimality loss under mild conditions. Simulation results, including those for non-convex AC-OPF problems in power grid operation, show that homeomorphic projection outperforms existing methods in solution feasibility and run-time complexity while achieving similar optimality loss.
We will also discuss open problems and future directions.
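To illustrate the bisection step described above, here is a toy sketch (not the speakers' implementation; the function names and the ball-shaped example set are illustrative assumptions). Given a candidate point and a feasibility oracle, it shrinks the point toward an interior feasible anchor (the origin here) by bisection on a scale factor until the scaled point is feasible, mirroring how the scheme pulls an infeasible INN-mapped prediction back into the constraint set:

```python
import numpy as np

def bisection_project(z, is_feasible, tol=1e-6, max_iter=60):
    """Toy bisection step: find the largest alpha in [0, 1] such that
    alpha * z is feasible, assuming the origin is an interior feasible
    point of the constraint set. Returns the scaled (feasible) point."""
    if is_feasible(z):
        return z                     # already feasible; nothing to do
    lo, hi = 0.0, 1.0                # lo * z feasible, hi * z infeasible
    for _ in range(max_iter):
        mid = 0.5 * (lo + hi)
        if is_feasible(mid * z):
            lo = mid                 # feasible: move the lower bound up
        else:
            hi = mid                 # infeasible: move the upper bound down
        if hi - lo < tol:
            break
    return lo * z                    # lo * z is feasible by construction

# Toy example: take the "constraint set" to be the unit ball itself.
unit_ball = lambda p: np.linalg.norm(p) <= 1.0
p = bisection_project(np.array([3.0, 4.0]), unit_ball)
```

In the actual scheme, the feasibility check is applied to the INN-mapped point, so the returned scaled point is pushed through the learned mapping to obtain the final solution in the original constraint set.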
About the Speaker
Minghua received his B.Eng. and M.S. degrees from the Department of Electronic Engineering at Tsinghua University, and his Ph.D. degree from the Department of Electrical Engineering and Computer Sciences at the University of California, Berkeley. He is a Professor in the Department of Data Science at City University of Hong Kong and an Associate Dean (Internationalization and Industry) of the College of Computing. He received the Eli Jury Award from UC Berkeley in 2007 (presented to a graduate student or recent alumnus for outstanding achievement in the area of systems, communications, control, or signal processing) and The Chinese University of Hong Kong Young Researcher Award in 2013. He has also received several best paper awards, including the IEEE ICME Best Paper Award in 2009, the IEEE Transactions on Multimedia Prize Paper Award in 2009, the ACM Multimedia Best Paper Award in 2012, the IEEE INFOCOM Best Poster Award in 2021, the ACM e-Energy Best Paper Award in 2023, and the Gradient AI Research Award in 2024. Coding primitives co-invented by Minghua have been incorporated into Microsoft Windows and Azure Cloud Storage, serving hundreds of millions of users. His recent research interests include online optimization and algorithms, machine learning in power system operation, intelligent transportation, distributed optimization, and delay-critical networking. He is an ACM Distinguished Scientist and an IEEE Fellow. Recently, Dr. Xiang Pan, a PhD student supervised by Minghua, received the 2024 ACM SIGEnergy Doctoral Dissertation Award.
Date
11 October 2024
Time
16:00 - 17:00
Venue
E3-2F-202, The Hong Kong University of Science and Technology (Guangzhou), and online
Join Link
Zoom Meeting ID: 923 2265 6521
Passcode: dsa2024
Organizer
Data Science and Analytics Thrust
Contact Email
dsarpg@hkust-gz.edu.cn