Continuous optimization is a core mathematical science for real-world problems ranging from the design of biomolecules to the management of investment portfolios. Continuous optimization means finding the minimum or maximum value of a function of one or many real variables, subject to constraints, which usually take the form of equations or inequalities. Continuous optimization has been studied by mathematicians since Newton, Lagrange, and Bernoulli.
One major focus of the continuous optimization group at Waterloo is convex optimization, that is, continuous optimization in which both the objective function and the feasible set are convex. Convex optimization problems have widespread practical applications and also have special properties that make them amenable to sophisticated analysis and powerful algorithms. Members of the group have carried out fundamental work in convex optimization, including new, more efficient algorithms and a deeper understanding of basic convex sets such as the cone of positive semidefinite matrices.
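As a toy illustration of the setting described above (not the group's own software; all names here are invented for the example), a convex objective can be minimized over a convex feasible set with projected gradient descent, sketched in plain Python:

```python
# Minimal sketch: minimize the convex function f(x) = (x - 3)^2
# over the convex feasible set [0, 1] by projected gradient descent.
# Illustrative only; not code from the Waterloo group.

def grad(x):
    """Gradient of f(x) = (x - 3)^2."""
    return 2.0 * (x - 3.0)

def project(x, lo=0.0, hi=1.0):
    """Euclidean projection onto the interval [lo, hi]."""
    return max(lo, min(hi, x))

def projected_gradient(x0, step=0.1, iters=100):
    """Alternate a gradient step with projection back onto the set."""
    x = x0
    for _ in range(iters):
        x = project(x - step * grad(x))
    return x

x_star = projected_gradient(0.5)
print(round(x_star, 6))  # prints 1.0: the constrained minimizer
```

Because both the objective and the feasible set are convex, every local minimum found this way is global, which is the key property that makes convex problems tractable.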
A second focus of the group is the application of convex optimization to nonconvex problems. Members of the group have shown how to apply convex optimization to NP-hard combinatorial problems, yielding results with surprisingly strong guarantees. The group has also applied convex optimization to the nonconvex problem of sensor network localization, in which sensors must determine their positions in space from measured distances to other nearby sensors. Finally, the group has developed a theory of general-purpose methods that use convex optimization to solve nonconvex optimization problems, including one of the hardest in the class: unstructured integer linear programming.
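A simple, standard instance of this relaxation idea (an illustrative sketch, not the group's methods) is the 0/1 knapsack problem: relaxing the binary constraint x_i ∈ {0,1} to x_i ∈ [0,1] yields a linear, hence convex, program whose optimum is attained by a greedy rule and upper-bounds the true integer optimum:

```python
# Toy sketch of convex relaxation for an NP-hard problem: relax the
# 0/1 knapsack to its fractional (LP) version, solvable greedily.
# The LP value bounds the integer optimum from above.

def fractional_knapsack(values, weights, capacity):
    """Solve the LP relaxation: sort by value/weight, fill greedily."""
    items = sorted(range(len(values)),
                   key=lambda i: values[i] / weights[i], reverse=True)
    total, room = 0.0, capacity
    for i in items:
        take = min(1.0, room / weights[i])  # fractional amount of item i
        total += take * values[i]
        room -= take * weights[i]
        if room <= 0:
            break
    return total  # upper bound on the 0/1 optimum

values, weights, capacity = [60, 100, 120], [10, 20, 30], 50
print(round(fractional_knapsack(values, weights, capacity), 6))  # prints 240.0
```

Here the relaxation value 240 certifies that no 0/1 packing can exceed it (the true integer optimum is 220), which is the kind of bound that relaxation-based guarantees build on.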
A final focus of the group is the robust solution of very large-scale optimization problems. Such problems may require parallel computing, automatic differentiation, and special handling of sparse matrices and vectors. The group was recently awarded a large high-performance cluster by the Canada Foundation for Innovation and the Province of Ontario to push the limits of the problems that can be tackled with these techniques.
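Of the techniques just listed, automatic differentiation is easy to sketch in miniature. The following is a minimal forward-mode example using dual numbers (illustrative only; the names and design are not the group's code):

```python
# Minimal sketch of forward-mode automatic differentiation via dual
# numbers: carry (value, derivative) pairs through arithmetic so the
# derivative is computed exactly, not by finite differences.

class Dual:
    """Number a + b*eps with eps**2 = 0; `dot` carries df/dx."""
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.dot + other.dot)
    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # Product rule: (a + b*eps)(c + d*eps) = ac + (ad + bc)*eps
        return Dual(self.val * other.val,
                    self.val * other.dot + self.dot * other.val)
    __rmul__ = __mul__

def derivative(f, x):
    """Evaluate f'(x) in a single forward pass by seeding dot = 1."""
    return f(Dual(x, 1.0)).dot

print(derivative(lambda x: x * x * x + 2 * x, 2.0))  # f'(x) = 3x^2 + 2, prints 14.0
```

For large-scale problems, the same idea scales to vectors of dual components (or to reverse mode), which is what makes gradient computation feasible without hand-coded derivatives.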
- Michael Best: Portfolio optimization, quadratic programming
- Tom Coleman: Large-scale computational optimization
- Walaa Moursi: Convex analysis, monotone operator theory, projection methods and splitting techniques
- Levent Tunçel: Mathematical optimization and mathematics of operations research
- Steve Vavasis: Continuous optimization, numerical analysis
- Henry Wolkowicz: Applications of optimization and matrix theory to algorithmic development