Mehdi Karimi

Assistant Professor | Optimization | Machine Learning | AI

Department of Mathematics, Illinois State University

Office: Stevenson Hall, STV 309B

Email: mkarim3@ilstu.edu

Operations of Complex Systems

NSF-Funded Research on Power Networks

This project focuses on developing scalable, decentralized optimization algorithms for managing large-scale power networks with high renewable integration. Supported by a two-year NSF grant, I lead a team of four graduate students in designing advanced techniques for graph-based partitioning, distributed optimization, and parameter tuning to improve efficiency and reliability.
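
The flavor of the decomposition can be conveyed by a toy consensus-ADMM loop in Python: each region solves a small local subproblem and all regions iterate toward agreement on a shared variable. This is only a sketch with synthetic data and a placeholder penalty parameter rho, not the partitioning scheme or algorithm developed in the project.

import numpy as np

# Toy consensus ADMM: each "region" i holds local data (A_i, b_i) and the
# regions must agree on a shared vector z. All sizes and data are synthetic.
rng = np.random.default_rng(0)
n_regions, n_vars, rho = 3, 5, 1.0
A = [rng.standard_normal((8, n_vars)) for _ in range(n_regions)]
x_true = rng.standard_normal(n_vars)
b = [Ai @ x_true + 0.01 * rng.standard_normal(8) for Ai in A]

x = [np.zeros(n_vars) for _ in range(n_regions)]
u = [np.zeros(n_vars) for _ in range(n_regions)]
z = np.zeros(n_vars)

for _ in range(100):
    # Local step: each region minimizes its own least-squares cost plus a
    # quadratic penalty pulling it toward the current consensus z.
    for i in range(n_regions):
        lhs = A[i].T @ A[i] + rho * np.eye(n_vars)
        rhs = A[i].T @ b[i] + rho * (z - u[i])
        x[i] = np.linalg.solve(lhs, rhs)
    # Consensus step: average the local estimates plus scaled dual variables.
    z = np.mean([x[i] + u[i] for i in range(n_regions)], axis=0)
    # Dual step: each region records its disagreement with the consensus.
    for i in range(n_regions):
        u[i] = u[i] + x[i] - z

print("consensus error:", np.linalg.norm(z - x_true))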

Healthcare & Insurance Analytics (OSF HealthCare)

Together with three faculty members and six students, we analyze real-world Social Determinants of Health (SDOH) data to predict Medicare Shared Savings Program (MSSP) claim counts using machine learning and deep learning. A central challenge is that SDOH records are often missing or incomplete, so the project develops modeling techniques that remain reliable despite such gaps, supporting healthcare analytics and risk modeling.
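
As a rough illustration of the modeling setup, and not the project's actual pipeline or data, the sketch below fits a Poisson-loss gradient-boosting model to synthetic count targets with missing features. The estimator used here accepts missing values directly, which is one simple way to cope with incomplete records; the features, missingness rate, and claim counts are all simulated.

import numpy as np
from sklearn.ensemble import HistGradientBoostingRegressor
from sklearn.model_selection import train_test_split

# Simulated stand-in for SDOH features with roughly 20% missing entries.
rng = np.random.default_rng(1)
X = rng.standard_normal((2000, 6))
X[rng.random(X.shape) < 0.2] = np.nan
rate = np.exp(0.3 * np.nan_to_num(X[:, 0]) - 0.2 * np.nan_to_num(X[:, 1]))
y = rng.poisson(rate)  # claim counts are nonnegative integers

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Poisson loss matches count-valued targets; NaNs in X are handled natively.
model = HistGradientBoostingRegressor(loss="poisson", random_state=0)
model.fit(X_train, y_train)
print("held-out R^2:", model.score(X_test, y_test))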

Predictive Maintenance Optimization (CERL)

In collaboration with the U.S. Army Construction Engineering Research Laboratory (CERL), we develop reinforcement learning-based predictive maintenance strategies for large-scale infrastructure. The project involves integrating probabilistic degradation modeling with optimization techniques to design cost-effective maintenance policies.
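
A stripped-down version of the underlying decision problem is a small Markov decision process over discrete degradation states, solved below by value iteration; the states, transition probabilities, and costs are illustrative placeholders rather than CERL models or data.

import numpy as np

# Five degradation states (4 = failed), two actions, discounted cost criterion.
n_states, gamma = 5, 0.95
P = np.zeros((2, n_states, n_states))  # P[action, s, s']
# Action 0: keep operating; the asset stays put or degrades one step.
for s in range(n_states - 1):
    P[0, s, s], P[0, s, s + 1] = 0.8, 0.2
P[0, n_states - 1, n_states - 1] = 1.0  # a failed asset stays failed
# Action 1: perform maintenance, which restores the asset to state 0.
P[1, :, 0] = 1.0

cost = np.zeros((2, n_states))
cost[0] = [0, 1, 2, 4, 50]  # operating cost grows with wear; failure is costly
cost[1] = 10                # flat maintenance cost

V = np.zeros(n_states)
for _ in range(500):
    Q = cost + gamma * P @ V  # Q[a, s] = c(a, s) + gamma * E[V(next state)]
    V = Q.min(axis=0)
policy = Q.argmin(axis=0)
print("maintain in states:", np.flatnonzero(policy == 1))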

Convex Optimization & Domain-Driven Solver (DDS)

Interior-Point Algorithms for Complex Problems

I design and analyze infeasible-start primal-dual interior-point methods for optimization problems in the Domain-Driven form, where constraints are defined via self-concordant barriers. Our software package, Domain-Driven Solver (DDS), efficiently handles LP, SOCP, SDP, hyperbolic, and entropy-based problems.
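
Schematically, and glossing over the technical assumptions spelled out in the papers and the DDS user guide, a problem in Domain-Driven form can be written as

\[
  \inf_{x \in \mathbb{R}^n} \ \langle c, x \rangle
  \quad \text{subject to} \quad A x \in D,
  \qquad D = \overline{\operatorname{dom}\,\Phi},
\]

where $\Phi$ is a self-concordant barrier whose domain has closure equal to the convex set $D \subseteq \mathbb{R}^m$. Choosing $\Phi$ appropriately recovers each of the problem classes listed above.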

Optimization of Quantum Entropy

DDS supports optimization problems involving quantum entropy and quantum relative entropy, outperforming traditional SDP-based approximations. We are working toward proving that a conjectured barrier for quantum relative entropy is self-concordant, and toward faster derivative evaluation for high-dimensional problems.
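
For reference, in standard notation (which may differ from the conventions used inside DDS), the quantum entropy of a positive semidefinite matrix $X$ and the quantum relative entropy of a pair $(X, Y)$ are

\[
  S(X) = -\operatorname{Tr}(X \log X),
  \qquad
  D(X \,\|\, Y) = \operatorname{Tr}\bigl(X \log X - X \log Y\bigr),
\]

and a typical formulation optimizes subject to constraints of the form $t \ge D(X \,\|\, Y)$ with $X, Y \succeq 0$, i.e. over the quantum relative entropy cone.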

Hyperbolic Programming

Hyperbolic programming has gained attention for its connections to sum-of-squares optimization and the generalized Lax conjecture. DDS accepts hyperbolic polynomials in three representations (monomial lists, straight-line programs, and determinantal formats), and we are investigating automatic differentiation techniques to compute their derivatives efficiently.
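
For context, a polynomial $p$ is hyperbolic with respect to a direction $e$ if $p(e) > 0$ and, for every $x$, the univariate polynomial $t \mapsto p(x - t e)$ has only real roots; those roots act as eigenvalues, and the associated hyperbolicity cone and hyperbolic program are

\[
  \Lambda_{+}(p, e) = \{\, x \ :\ \text{all roots of } t \mapsto p(x - t e) \text{ are nonnegative} \,\},
  \qquad
  \min \{\, c^{\top} x \ :\ A x = b, \ x \in \Lambda_{+}(p, e) \,\}.
\]

Taking $p = \det$ on symmetric matrices with $e = I$ recovers semidefinite programming, and $-\log p$ is a self-concordant barrier for $\Lambda_{+}(p, e)$, so the practical cost of each polynomial representation comes down to how cheaply $p$ and its first two derivatives can be evaluated.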

Sum-of-Squares & SDP Complexity

We study the computational complexity of semidefinite programming (SDP) through the lens of sum-of-squares methods, exploring open questions about SDP feasibility in the Turing machine model. These techniques have direct applications in machine learning, including sparse recovery and dictionary learning.
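
The bridge between the two is the classical Gram-matrix reformulation: a polynomial $p$ of degree $2d$ is a sum of squares exactly when

\[
  p(x) = z_d(x)^{\top} Q \, z_d(x) \quad \text{for some } Q \succeq 0,
\]

where $z_d(x)$ is the vector of monomials of degree at most $d$. Deciding whether such a $Q$ exists is an SDP feasibility problem, which is why the complexity of SDP feasibility bears directly on what sum-of-squares methods can certify.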