When faced with the challenge of making decisions in the presence of multiple uncertainties, a common simplifying heuristic is to assume independence among the various sources of uncertainty. Although popular, the effect of this heuristic on solution quality is poorly understood. In this talk, I will examine the risk associated with ignoring correlations by introducing a new concept, the 'Correlation Gap'. The correlation gap captures the price of correlations in stochastic optimization: a small correlation gap implies that the efficient heuristic of assuming independence is in fact robust against any adversarial correlations, while a large correlation gap suggests that it is important to invest more in data collection and in learning the correlations. Additionally, I will demonstrate that bounding the correlation gap arises as a fundamental question when analyzing the approximation ratios achieved by certain randomized algorithmic strategies for deterministic combinatorial optimization problems. I will use function properties such as monotonicity and submodularity, and employ techniques of cross-monotonic cost-sharing schemes from game theory in a novel manner, to provide a tight characterization of the functions with small correlation gap. The results include small constant bounds for cost functions arising in many popular applications, such as stochastic facility location, Steiner tree network design, minimum spanning tree, and single-source rent-or-buy network design.
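The verbal description above can be made concrete as a worst-case ratio; the sketch below follows the standard formalization in the stochastic optimization literature, and the symbols ($f$, $p_i$, $D$, $I(p)$, $\kappa$) are my notation for illustration, not taken from the abstract:

```latex
% Sketch of the standard definition (notation is illustrative):
% f is a nonnegative set function over a ground set N, each element i
% appears with marginal probability p_i, D ranges over all joint
% distributions consistent with those marginals, and I(p) denotes the
% product (independent) distribution with the same marginals.
\[
  \kappa(f, p) \;=\;
  \sup_{D \,:\; \Pr_{S \sim D}[\,i \in S\,] = p_i \ \forall i \in N}
  \frac{\mathbb{E}_{S \sim D}\,[f(S)]}{\mathbb{E}_{S \sim I(p)}\,[f(S)]}.
\]
```

Under this reading, $\kappa$ close to $1$ means the independence heuristic loses little even against adversarially chosen correlations, while a large $\kappa$ signals that learning the joint distribution matters.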
Towards the end of the talk, I will briefly discuss my research on some other topics, including algorithms for online optimization and techniques for prediction market design.