TY - JOUR
T1 - Computationally Efficient Approximations for Distributionally Robust Optimization under Moment and Wasserstein Ambiguity
AU - Cheramin, Meysam
AU - Cheng, Jianqiang
AU - Jiang, Ruiwei
AU - Pan, Kai
PY - 2022/6
Y1 - 2022/6
AB - Distributionally robust optimization (DRO) is a modeling framework for decision making under uncertainty in which the probability distribution of a random parameter is unknown, although partial information about it (e.g., its statistical properties) is available. In this framework, the unknown distribution is assumed to lie in an ambiguity set consisting of all distributions compatible with the available partial information. Although DRO bridges the gap between stochastic programming and robust optimization, one of its limitations is that its models for large-scale problems can be very difficult to solve, especially when the uncertainty is high-dimensional. In this paper, we propose computationally efficient inner and outer approximations for DRO problems with a piecewise linear objective function under two ambiguity sets: a moment-based set and a combined set that incorporates both Wasserstein distance and moment information. In these approximations, we split the random vector into smaller pieces, leading to smaller matrix constraints, and we use principal component analysis to reduce the dimensionality of the uncertainty space. We quantify the quality of the developed approximations by deriving theoretical bounds on their optimality gaps. We demonstrate the practical applicability of the proposed approximations on a production–transportation problem and a multiproduct newsvendor problem. The results show that these approximations dramatically reduce computational time while maintaining high solution quality. The approximations also help construct an interval that is tight in most cases and contains the (unknown) optimal value of a large-scale DRO problem, which usually cannot be solved to optimality (or even feasibility).
U2 - https://doi.org/10.1287/ijoc.2021.1123
DO - 10.1287/ijoc.2021.1123
M3 - Journal article
VL - 34
SP - 1768
EP - 1794
JO - INFORMS Journal on Computing
JF - INFORMS Journal on Computing
SN - 1091-9856
IS - 3
ER -