2018-2019 Operations Research Seminar

Welcome to the Web page of the SFU Operations Research Seminar Series. We are associated with:
CORDS (Centre for Operations Research and Decision Sciences), and
The Department of Mathematics, Simon Fraser University.
Our aim is to meet and discuss Operations Research topics.

Unless otherwise noted, talks are at 3:30 p.m. on Thursdays in Room 3040, SFU Surrey.
Please contact Tamon Stephen if you would like to speak.

Date Speaker Title and Abstract
June 4th

Joint O.R. and Discrete Math Seminar

*AQ K9509*
Ahmad Abdi

Tepper School of Business
Carnegie Mellon University

Department of Mathematics
London School of Economics
Ideal clutters and k-wise intersecting families

A clutter is *ideal* if the corresponding set covering polyhedron has no fractional vertices, and it is *k-wise intersecting* if its members have no common element but every k of them do.

We conjecture that there is a constant k such that every k-wise intersecting clutter is non-ideal.

I will show how this conjecture for k=4 would be an extension of Jaeger’s 8-flow theorem, and how a variation of the conjecture for k=3 would be an extension of Tutte’s 4-flow conjecture. I will also discuss connections to tangential 2-blocks, binary projective geometries, the sums of circuits property, etc.
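A small standard example illustrating both definitions (illustrative only, not taken from the abstract) is the clutter of edges of a triangle:

```latex
\[
\mathcal{C} = \bigl\{\{1,2\},\{2,3\},\{1,3\}\bigr\}.
\]
% Any two members share an element, but no element lies in all three,
% so C is 2-wise intersecting. Its set covering polyhedron
\[
\{x \in \mathbb{R}^3_{\ge 0} : x_i + x_j \ge 1 \ \text{for each } \{i,j\} \in \mathcal{C}\}
\]
% has the fractional vertex (1/2, 1/2, 1/2), so C is also non-ideal.
```

Here all three covering constraints hold with equality at (1/2, 1/2, 1/2), and the three tight constraints are linearly independent, so this fractional point is indeed a vertex.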
April 25th

Jean-Philippe Richard

Industrial and Systems Engineering

University of Minnesota
On the Convexification of Permutation-invariant Sets arising in MINLP and Data Problems

Permutation-invariant sets, i.e. sets that do not change when their variables are permuted, appear in a variety of optimization problems, including sparse principal component analysis. Solving permutation-invariant optimization problems to global optimality requires the derivation of strong convex relaxations of their feasible regions. In this talk, we study how to construct the convex hull of a permutation-invariant set in a higher-dimensional space. When projection is possible, we also obtain convex hull descriptions in the original space of variables. We illustrate our techniques by developing (1) a novel reformulation and relaxation for sparse principal component analysis, (2) the convex hull of matrices with bounded rank and spectral norm, (3) convex envelopes of multilinear functions over certain domains, and (4) linear descriptions of logical and sparsity constraints arising in the formulation of 0-1 mixed-integer programs. This is joint work with Mohit Tawarmalani (Purdue) and Jinhak Kim (University of South Alabama).
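As a minimal sketch of the central definition (a toy brute-force check of my own, not a method from the talk), a finite set of points is permutation-invariant exactly when every permutation of the coordinates maps the set onto itself:

```python
import itertools

def is_permutation_invariant(members):
    """Brute-force check: a finite point set is permutation-invariant
    if permuting coordinates maps the set onto itself."""
    pts = {tuple(p) for p in members}
    n = len(next(iter(pts)))
    for perm in itertools.permutations(range(n)):
        # Apply the coordinate permutation to every point.
        if {tuple(p[i] for i in perm) for p in pts} != pts:
            return False
    return True

# The feasible set {x in {0,1}^3 : x1 + x2 + x3 <= 1} — a simple
# sparsity constraint — is permutation-invariant:
S = [(0, 0, 0), (1, 0, 0), (0, 1, 0), (0, 0, 1)]
print(is_permutation_invariant(S))       # True

# Dropping one point destroys the symmetry:
print(is_permutation_invariant(S[:-1]))  # False
```

Sparsity constraints such as the one above are exactly the kind of permutation-invariant structure item (4) of the abstract refers to.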
April 5th

*Friday, 1:30*

*RCB 6125* (Burnaby)
Liam Erdos, Ben Gregson, Shane Jace and Martin Zhu

Simon Fraser University
Math 402W Operations Research Clinic project presentation

Coordinating Primary Care Operating Hours to Reduce Acute Care Visits

February 28th



*SUR 3040*
Xiaorui Li

Ph.D. thesis defence

Senior Supervisor: Z. Lu
Sparse and Low Rank Approximation via Partial Regularization: Models, Theory and Algorithms

Sparse representation and low-rank approximation are fundamental tools in the fields of signal processing and pattern analysis. In this thesis, we introduce partial regularizers to these problems in order to neutralize the bias incurred by the large entries (in magnitude) of the associated vector or the large singular values of the associated matrix. In particular, we first consider a class of constrained optimization problems whose constraints involve a cardinality or rank constraint. Under suitable assumptions, we show that the penalty formulation based on a partial regularization is an exact reformulation of the original problem, in the sense that the two share the same global minimizers. We also show that a local minimizer of the original problem is a local minimizer of the penalty reformulation. Specifically, we propose a class of models with partial regularization for recovering a sparse solution of a linear system. We then study some theoretical properties of these models, including the existence of optimal solutions, sparsity inducing, local or global recovery, and stable recovery. In addition, we propose numerical algorithms for solving those models, in which each subproblem is solved by a nonmonotone proximal gradient (NPG) method. Despite the complication of the partial regularizers, we show that each proximal subproblem in NPG can be solved as a certain number of one-dimensional optimization problems, which usually have a closed-form solution. The global convergence of these methods is also established. Finally, we compare the performance of our approach with some existing approaches on both randomly generated and real-life instances, and report some promising computational results.
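The abstract notes that each proximal subproblem in the NPG method separates into one-dimensional problems with closed-form solutions. A familiar, much simpler instance of that pattern (the prox of the full l1 norm, not the thesis's partial regularizer) is coordinate-wise soft-thresholding:

```python
import numpy as np

def prox_l1(v, lam):
    """Proximal operator of lam * ||x||_1:
        argmin_x  0.5 * ||x - v||^2 + lam * ||x||_1.
    The objective separates over coordinates, so the n-dimensional
    problem reduces to n one-dimensional problems, each solved in
    closed form by soft-thresholding."""
    return np.sign(v) * np.maximum(np.abs(v) - lam, 0.0)

v = np.array([3.0, -0.5, 1.2, -2.0])
print(prox_l1(v, 1.0))  # approximately [ 2.  -0.   0.2 -1. ]
```

A partial regularizer penalizes only some of the entries, so its prox is no longer this simple formula, but, as the abstract states, it still reduces to one-dimensional subproblems of the same flavour.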
PIMS - SFU Seminar

December 6th

*SUR 5380*

Asia Ivic Weiss

York University
Regular Polyhedra, Polytopes and Beyond

In this talk we summarize the classification of regular polyhedra and polytopes and extend the concept to that of a hypertope: a thin, residually connected incidence geometry. We present the characterization of the automorphism groups of regular hypertopes and give an overview of recent results on the classification of toroidal hypertopes.
October 6th

*Saturday, 8:30-4:00*

Hosted by UBC Okanagan

Details of the Fall 2018 West Coast Optimization Meeting are available from the hosts.
PIMS - SFU Seminar

October 4th
Jong-Shi Pang

Daniel J. Epstein Department of Industrial and Systems Engineering

University of Southern California
Non-problems in Optimization for Statistics

The phrase "non-problems" in the title refers to a collection of adjectives that start with the prefix "non". These include "non-convex", "non-differentiable", "non-traditional", "non-trivial", and "non-stochastic gradient" (as a counter to a topical research theme), all in the context of optimization for statistics. Beyond stand-alone optimization problems, the phrase could also include "non-cooperative" game theory, as this is an area with significant research opportunities when combined with the other "non" subjects. I will present a variety of these non-problems and give a brief summary of my research on them.

Archives of the 2006-07, 2007-08, 2008-09, 2009-10, 2010-11, 2011-12, 2013-14, 2014-15, 2015-16, 2016-17, and 2017-18 SFU Operations Research Seminars.

Last modified September 17th, 2018.