Department of Mathematics, Illinois State University
Office: Stevenson Hall, STV 309B
Email: mkarim3@ilstu.edu
This project focuses on developing scalable, decentralized optimization algorithms for managing large-scale power networks with high renewable integration. Supported by a two-year NSF grant, I lead a team of four graduate students in designing advanced techniques for graph-based partitioning, distributed optimization, and parameter tuning to improve efficiency and reliability.
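To give a flavor of the decentralized coordination involved, here is a toy sketch (not the project's actual algorithm, and the network and measurements are invented): nodes in a small power network reach the network-wide average of local measurements purely by exchanging values with neighbors, the basic consensus primitive behind many distributed optimization schemes.

```python
# Toy consensus averaging on a hypothetical 4-bus ring network.
# Each node only ever talks to its neighbors; no central coordinator.

def consensus_averaging(values, neighbors, rounds=200):
    """Each node repeatedly replaces its value with the average of its
    closed neighborhood. On a regular graph this update is doubly
    stochastic, so every node converges to the global mean."""
    x = list(values)
    for _ in range(rounds):
        x = [
            (x[i] + sum(x[j] for j in neighbors[i])) / (1 + len(neighbors[i]))
            for i in range(len(x))
        ]
    return x

# 4-bus ring with invented local measurements (say, MW injections).
ring = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [2, 0]}
local = [10.0, 30.0, 50.0, 70.0]
estimates = consensus_averaging(local, ring)
# Every node converges to the network-wide mean, 40.0.
```

The same message-passing pattern, applied to subproblem solutions rather than raw measurements, is what makes graph-partitioned distributed optimization scale.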
Collaborating with three faculty members and six students, we analyze real-world Social Determinants of Health (SDOH) data to predict Medicare Shared Savings Program (MSSP) claim counts using machine learning and deep learning. Because SDOH records are frequently missing or incomplete, the project develops robust imputation and modeling techniques for healthcare analytics and risk modeling.
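One concrete preprocessing step of the kind this work requires is imputation of missing features. The sketch below shows simple per-column mean imputation; the feature names and numbers are invented, and the project's actual techniques are more sophisticated than this baseline.

```python
# Baseline mean imputation for tabular SDOH-style data with missing
# entries encoded as None. Invented columns: median income (k$),
# uninsured rate (%), food-insecurity rate (%).

def impute_means(rows):
    """Replace None entries with the per-column mean of observed values."""
    cols = list(zip(*rows))
    means = []
    for col in cols:
        observed = [v for v in col if v is not None]
        means.append(sum(observed) / len(observed) if observed else 0.0)
    return [
        [means[j] if v is None else v for j, v in enumerate(row)]
        for row in rows
    ]

sdoh = [
    [52.0, 8.5, None],
    [None, 12.0, 14.0],
    [61.0, None, 10.0],
]
clean = impute_means(sdoh)
# Each None is filled with its column's observed mean.
```

In practice, imputation quality matters as much as the downstream model, since systematically missing SDOH fields can bias risk estimates.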
In collaboration with the U.S. Army Construction Engineering Research Laboratory (CERL), we develop reinforcement learning-based predictive maintenance strategies for large-scale infrastructure. The project involves integrating probabilistic degradation modeling with optimization techniques to design cost-effective maintenance policies.
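The core idea can be illustrated with a minimal Markov decision process sketch; this is not CERL's actual model, and all states, costs, and probabilities below are invented. A component degrades stochastically through condition states, and value iteration recovers a cost-minimizing repair policy.

```python
# Toy predictive-maintenance MDP: condition states 0 (good) to 3 (failed).
# Each period we either "wait" (risking further degradation) or "repair"
# (pay a fixed cost and return to state 0). All numbers are invented.

DEGRADE_P = 0.4                          # chance of worsening one state per period
OPERATING_COST = [0.0, 1.0, 4.0, 20.0]   # per-period cost in each condition state
REPAIR_COST = 6.0
GAMMA = 0.95                             # discount factor

def q_values(s, V):
    """Expected discounted cost of each action in state s, given values V."""
    worse = min(s + 1, len(V) - 1)
    wait = OPERATING_COST[s] + GAMMA * (DEGRADE_P * V[worse] + (1 - DEGRADE_P) * V[s])
    repair = REPAIR_COST + OPERATING_COST[0] + GAMMA * V[0]
    return wait, repair

def value_iteration(n_states=4, iters=500):
    V = [0.0] * n_states
    for _ in range(iters):
        V = [min(q_values(s, V)) for s in range(n_states)]
    policy = ["wait" if q_values(s, V)[0] <= q_values(s, V)[1] else "repair"
              for s in range(n_states)]
    return V, policy

values, policy = value_iteration()
# Repairing is optimal once the component reaches the failed state.
```

Replacing this hand-coded transition model with a learned probabilistic degradation model, and value iteration with reinforcement learning at infrastructure scale, is where the research challenges lie.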
I design and analyze infeasible-start primal-dual interior-point methods for optimization problems in the Domain-Driven form, where constraints are defined via self-concordant barriers. Our software package, Domain-Driven Solver (DDS), efficiently handles LP, SOCP, SDP, hyperbolic, and entropy-based problems.
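Schematically, and in notation paraphrased rather than quoted from the DDS documentation, the Domain-Driven form is a linear objective minimized over the preimage of a convex set that is accessed only through its barrier:

```latex
% Domain-Driven form: minimize a linear objective over the preimage of a
% convex set D given by a self-concordant barrier \Phi.
\inf_{x \in \mathbb{R}^n} \ \langle c, x \rangle
\quad \text{subject to} \quad Ax \in D,
\qquad D = \overline{\operatorname{dom}\Phi},
```

where \(\Phi\) is a self-concordant barrier whose domain is the interior of \(D\). Because feasibility is certified through the barrier rather than an explicit conic description, LP, SOCP, SDP, hyperbolic, and entropy-based constraints can all be handled in one framework.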
DDS supports optimization involving quantum entropy and quantum relative entropy, outperforming traditional SDP-based approximations. We are working toward proving that a conjectured function is indeed a self-concordant barrier for the quantum relative entropy cone, and toward improving derivative-evaluation methods for high-dimensional problems.
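For reference, the quantum relative entropy of positive definite matrices is the standard quantity

```latex
D(X \,\|\, Y) \;=\; \operatorname{Tr}\!\bigl[ X (\log X - \log Y) \bigr],
\qquad X, Y \succ 0,
```

and the optimization works with its epigraph, \(\operatorname{cl}\{(t, X, Y) : t \ge D(X \,\|\, Y),\ X, Y \succ 0\}\). As I understand it, the conjectured barrier is roughly of the form \(-\log\bigl(t - D(X \,\|\, Y)\bigr) - \log\det X - \log\det Y\); see the DDS papers for the precise statement of the conjecture.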
Hyperbolic programming has gained attention for its connections to sum-of-squares optimization and the generalized Lax conjecture. DDS accepts hyperbolic polynomials in three representations (monomial lists, straight-line programs, and determinantal representations), and we investigate automatic differentiation techniques to compute their derivatives efficiently.
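To fix terminology, here are the standard definitions (not specific to DDS): a homogeneous polynomial \(p\) is hyperbolic with respect to a direction \(e\) if \(p(e) \neq 0\) and, for every \(x\), the univariate polynomial \(t \mapsto p(te - x)\) has only real roots. Its hyperbolicity cone is

```latex
\Lambda_{+}(p, e) \;=\; \bigl\{\, x \;:\; \text{all roots of } t \mapsto p(te - x)
\text{ are nonnegative} \,\bigr\}.
```

The motivating example: \(p(X) = \det X\) on symmetric matrices with \(e = I\) recovers the positive semidefinite cone, since the roots of \(t \mapsto \det(tI - X)\) are exactly the eigenvalues of \(X\).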
We study the computational complexity of semidefinite programming (SDP) through the lens of sum-of-squares methods, exploring open questions about SDP feasibility in the Turing machine model. These techniques have direct applications in machine learning, including sparse recovery and dictionary learning.
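As a small worked instance of the SOS-to-SDP correspondence (my own toy example, not drawn from the project): a polynomial is a sum of squares exactly when it admits a positive semidefinite Gram matrix over a monomial basis. For \(p(x) = x^4 + 2x^2 + 1\) with \(z = (1, x, x^2)^\top\),

```latex
p(x) \;=\; z^\top Q z, \qquad
Q \;=\; \begin{pmatrix} 1 & 0 & 1 \\ 0 & 0 & 0 \\ 1 & 0 & 1 \end{pmatrix} \;\succeq\; 0 .
```

Matching coefficients pins \(Q\) to an affine family (\(Q_{11} = 1\), \(Q_{33} = 1\), \(2Q_{13} + Q_{22} = 2\), odd-degree entries zero), so deciding whether that family contains a PSD member is precisely an SDP feasibility problem; here the \(Q\) above is rank one, recovering \(p = (x^2 + 1)^2\).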