2018-2019 Operations Research Seminar

Welcome to the Web page of the SFU Operations Research Seminar Series. We are associated with:
CORDS (Centre for Operations Research and Decision Sciences), and
The Department of Mathematics, Simon Fraser University.
Our aim is to meet and discuss Operations Research topics.

Unless otherwise noted, talks are at 3:30 p.m. on Thursdays in Room 3040, SFU Surrey.
Please contact Tamon Stephen if you would like to speak.

Date Speaker Title and Abstract
February 28th

*SUR 3040*

Xiaorui Li

Ph.D. thesis defence
Senior Supervisor: Z. Lu

Sparse and Low Rank Approximation via Partial Regularization: Models, Theory and Algorithms

Sparse representation and low-rank approximation are fundamental tools in the fields of signal processing and pattern analysis. In this thesis, we introduce partial regularizers for these problems in order to neutralize the bias incurred by the large entries (in magnitude) of the associated vector, or by the large singular values of the associated matrix. In particular, we first consider a class of constrained optimization problems whose constraints involve a cardinality or rank constraint. Under suitable assumptions, we show that the penalty formulation based on a partial regularization is an exact reformulation of the original problem, in the sense that the two share the same global minimizers. We also show that a local minimizer of the original problem is a local minimizer of the penalty reformulation. We then propose a class of models with partial regularization for recovering a sparse solution of a linear system, and study their theoretical properties, including existence of optimal solutions, sparsity inducement, local or global recovery, and stable recovery. In addition, we propose numerical algorithms for solving these models in which each subproblem is solved by a nonmonotone proximal gradient (NPG) method. Despite the complication of the partial regularizers, we show that each proximal subproblem in NPG can be solved as a certain number of one-dimensional optimization problems, which usually admit closed-form solutions. The global convergence of these methods is also established. Finally, we compare the performance of our approach with some existing approaches on both randomly generated and real-life instances, and report promising computational results.
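For readers unfamiliar with proximal gradient schemes, the following is a minimal illustrative sketch, not the thesis's actual algorithm: a plain (monotone) proximal gradient loop for sparse recovery, where a hypothetical partial l1 regularizer leaves the k largest-magnitude entries unpenalized and soft-thresholds the rest. The nonmonotone line search and the exact prox computations of the NPG method are omitted.

```python
import numpy as np

def prox_partial_l1(v, lam, k):
    # Illustrative surrogate for the prox of lam * (sum of the n-k
    # smallest |v_i|): soft-threshold every entry, then restore the
    # k largest-magnitude entries of v unpenalized. This is a
    # simplification, not the exact prox derived in the thesis.
    x = np.sign(v) * np.maximum(np.abs(v) - lam, 0.0)
    keep = np.argsort(np.abs(v))[-k:]
    x[keep] = v[keep]
    return x

def partial_reg_recovery(A, b, lam=0.1, k=5, iters=500):
    # Proximal gradient on 0.5*||Ax - b||^2 + lam * (partial l1),
    # with a fixed step 1/L, L = ||A||_2^2 (Lipschitz constant of
    # the gradient of the smooth part).
    _, n = A.shape
    step = 1.0 / np.linalg.norm(A, 2) ** 2
    x = np.zeros(n)
    for _ in range(iters):
        grad = A.T @ (A @ x - b)
        x = prox_partial_l1(x - step * grad, step * lam, k)
    return x

# Toy instance: recover a 5-sparse vector from 40 random measurements.
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 100))
x_true = np.zeros(100)
x_true[:5] = rng.standard_normal(5)
b = A @ x_true
x_hat = partial_reg_recovery(A, b, lam=0.1, k=5)
```

Because the top-k entries escape the penalty, large coefficients are not shrunk toward zero, which is the bias-reduction effect the abstract describes for partial regularization.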
PIMS - SFU Seminar

Dec. 6th

*SUR 5380*

Asia Ivic Weiss

York University
Regular Polyhedra, Polytopes and Beyond

In this talk we summarize the classification of regular polyhedra and polytopes and extend the concept to that of a hypertope: a thin, residually connected incidence geometry. We present a characterization of the automorphism groups of regular hypertopes and review recent results on the classification of toroidal hypertopes.
Oct. 6th

*Saturday, 8:30-4:00*


Hosted by UBC Okanagan

Details of the Fall 2018 West Coast Optimization Meeting are available from the hosts.
PIMS - SFU Seminar

Oct. 4th
Jong-Shi Pang

Daniel J. Epstein Department of Industrial and Systems Engineering

University of Southern California
Non-problems in Optimization for Statistics

The phrase "non-problems" in the title refers to a collection of adjectives that start with the prefix "non". These include "non-convex", "non-differentiable", "non-traditional", "non-trivial", and "non-stochastic gradient" (as a counter to a topical research theme), all in the context of optimization for statistics. Beyond stand-alone optimization problems, the phrase could also include "non-cooperative" game theory, an area with significant research opportunities when combined with the other "non" subjects. I will present a variety of these non-problems and briefly summarize my research on them.

Archives of the 2006-07, 2007-08, 2008-09, 2009-10, 2010-11, 2011-12, 2013-14, 2014-15, 2015-16, 2016-17, and 2017-18 SFU Operations Research Seminars.

Last modified September 17th, 2018.