Convex optimization is an important special case of mathematical optimization, with applications in a wide range of disciplines. Its powerful and elegant theory, coupled with faster and more reliable numerical linear algebra software and increasingly powerful computers, has spread its applications across many fields, such as (1) data science, (2) control theory and signal processing, (3) relaxation and randomization for hard nonconvex problems, and (4) robust optimization.
We released DDS 2.1 in July 2022; this version has been accepted for publication in Mathematical Programming Computation. DDS accepts many interesting classes of function/set constraints: with DDS 2.1 you can solve several classes of convex optimization problems, ranging from LP and SDP to quantum entropy and hyperbolic programming. DDS is also easily extensible: once a self-concordant (s.c.) barrier is discovered for a set, the corresponding set constraint can be added to DDS.
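DDS itself is a MATLAB package with its own input format, so the snippet below is only a rough illustration, in Python with CVXPY as a stand-in modeling tool, of one of the problem classes DDS handles (a small SDP); the calls shown are CVXPY's, not the DDS interface:

```python
# A minimal Python/CVXPY sketch of one problem class DDS handles (an SDP);
# DDS itself is a MATLAB package and uses a different input format.
import numpy as np
import cvxpy as cp

n = 3
rng = np.random.default_rng(0)
C = rng.standard_normal((n, n))
C = (C + C.T) / 2                      # symmetric cost matrix

X = cp.Variable((n, n), PSD=True)      # decision variable constrained to be PSD
prob = cp.Problem(cp.Minimize(cp.trace(C @ X)),
                  [cp.trace(X) == 1])  # normalization keeps the problem bounded
prob.solve()

# For this normalized problem, the optimal value is the smallest eigenvalue of C.
print(prob.value, np.linalg.eigvalsh(C).min())
```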
Rapidly transforming power networks, characterized by millions of controllable nodes, demand solutions grounded in a comprehensive grasp of the system's complexities and in advanced mathematics. Combining optimization and engineering expertise, I aim to pioneer innovative strategies, adapting new scalable optimization techniques and data-driven approaches to modern power networks. The primary goal is to design robust algorithms that heighten efficiency, reliability, and adaptability in the face of evolving, decarbonized energy paradigms.
In a very interesting theoretical project, we are using the theory of sums of squares (SOS) of polynomials to attack the Turing machine complexity of semidefinite programming (SDP). Even though an approximate solution of an SDP can be found in polynomial time, and having an SDP formulation is desirable in many applications, very little is known about the complexity of SDP in the Turing machine model; even the question of whether SDP feasibility lies in NP is open. In the first part of this project, we have been studying the complexity of SDP problems arising from univariate nonnegative polynomials and trying to deepen our understanding of their structure.
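To make the connection concrete, here is the classical equivalence underlying this part of the project (the notation is our choice): a univariate polynomial p of even degree 2d is nonnegative on the real line if and only if it is a sum of squares, which is in turn an SDP feasibility problem:

\[
p(x) = v_d(x)^\top Q\, v_d(x) \ \text{ for some } Q \succeq 0, \qquad v_d(x) = (1, x, \dots, x^d)^\top,
\]

and matching coefficients on both sides gives the linear constraints \(p_k = \sum_{i+j=k} Q_{ij}\) on the Gram matrix \(Q\).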
Nonnegative polynomials and SOS problems have been studied for more than a century; however, numerous new applications in optimization and machine learning have made them very popular among optimizers. The SOS method has been used to improve guarantees for many approximation algorithms and has become a new tool in computational complexity. Among its new applications in machine learning, the method has yielded new bounds for sparse vector recovery and dictionary learning.
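As a minimal sketch of the univariate SOS-to-SDP equivalence in practice (our own illustration, again using Python with CVXPY rather than any solver mentioned above), the following code certifies that a polynomial is a sum of squares by solving the Gram-matrix feasibility SDP:

```python
# A minimal sketch (our illustration): certifying that a univariate
# polynomial is a sum of squares via the Gram-matrix SDP in CVXPY.
import cvxpy as cp

def is_sos(p):
    """p = [p_0, p_1, ..., p_{2d}]: coefficients of a degree-2d polynomial.
    Returns True iff p(x) = v(x)^T Q v(x) for some PSD Q, v(x) = (1, x, ..., x^d)."""
    deg = len(p) - 1
    if deg == 0:
        return p[0] >= 0
    if deg % 2 == 1:
        return False                       # odd-degree polynomials are never SOS
    d = deg // 2
    Q = cp.Variable((d + 1, d + 1), PSD=True)
    # Coefficient matching: p_k = sum over i+j=k of Q_ij
    constraints = [
        sum(Q[i, k - i] for i in range(max(0, k - d), min(k, d) + 1)) == p[k]
        for k in range(deg + 1)
    ]
    prob = cp.Problem(cp.Minimize(0), constraints)
    prob.solve()
    return prob.status == cp.OPTIMAL

# (x^2 - 1)^2 = x^4 - 2x^2 + 1 is a perfect square, hence SOS:
print(is_sos([1.0, 0.0, -2.0, 0.0, 1.0]))  # expected: True
```

Each call solves a single feasibility SDP; the same Gram-matrix idea extends to multivariate polynomials, where the size of Q grows combinatorially with the number of variables and the degree.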