Optimization Reading Group
- WWH 1314
- Wednesday 3:30 - 5:00pm
- Week 0 (17 Jan): Meet to discuss topic assignment
- Week 1 (24 Jan): Line Search Methods -- Sara
- Wolfe Conditions
- Steepest Descent
- Newton's Method
- Convergence Results
- Week 2 (31 Jan): Conjugate Gradients -- Jason
- Week 3 (7 Feb): Quasi-Newton Methods -- Marc
- Week 4 (14 Feb): Derivative-Free Optimization -- Marc
- Week 5 (21 Feb): Theory of Constrained Optimization -- Denis
- Week 6 (28 Feb): Linear Programming (Simplex) -- Koray
- Week 7 (7 Mar): Linear Programming (Interior-Point) -- Koray
- 14 Mar: Spring recess
- Week 8 (21 Mar): Nonlinear Constrained Optimization -- Jason
- Week 9 (28 Mar): Interior-Point Methods for Nonlinear Constrained Optimization -- Marc
- Week 10 (4 Apr): PDE-constrained Optimization -- Denis
- Week 11 (11 Apr): Convex Programming -- Sarah
- Week 12 (18 Apr): Integer Programming
- Week 13 (25 Apr): Non-smooth Optimization -- Marc
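As a small taste of the Week 1 material (line search methods, the Wolfe/Armijo conditions, steepest descent), here is a minimal sketch in Python of steepest descent with a backtracking line search that enforces the Armijo sufficient-decrease condition. The function names, the test problem, and the parameter values (c1 = 1e-4, backtracking factor 0.5) are illustrative choices, not prescribed by the reading.

```python
import numpy as np

def backtracking_steepest_descent(f, grad, x0, c1=1e-4, rho=0.5,
                                  tol=1e-8, max_iter=1000):
    """Steepest descent with an Armijo backtracking line search (sketch)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        p = -g                      # steepest-descent direction
        alpha = 1.0
        # Shrink the step until the Armijo condition holds:
        #   f(x + alpha*p) <= f(x) + c1 * alpha * g.p
        while f(x + alpha * p) > f(x) + c1 * alpha * (g @ p):
            alpha *= rho
        x = x + alpha * p
    return x

# Illustrative strictly convex quadratic: f(x) = 0.5 x'Ax - b'x,
# whose minimizer solves the linear system A x = b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b
xstar = backtracking_steepest_descent(f, grad, np.zeros(2))
```

The Armijo condition is only the first of the two Wolfe conditions; a full Wolfe line search would also impose a curvature condition on the new gradient, which matters more for the quasi-Newton methods later in the schedule.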
- Scribe Notes
Here is a LaTeX template file for scribe notes. Please rename the file to lecture#.tex and fill in the proper information for the scribe and topic commands. If you need any shortcuts or commands, let me know; I will eventually create a shortcuts.tex file for everyone.
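For concreteness, a skeleton of what a filled-in lecture#.tex might look like is sketched below; the actual template's command names and layout may differ, so treat everything here (including the \scribe and \topic definitions) as assumptions.

```latex
% lecture1.tex -- hypothetical sketch; follow the real template where it differs
\documentclass{article}
\usepackage{amsmath,amssymb}
% \input{shortcuts}   % shared macros, once shortcuts.tex exists

% Assumed definitions for the scribe and topic commands:
\newcommand{\scribe}[1]{\def\thescribe{#1}}
\newcommand{\topic}[1]{\def\thetopic{#1}}

\scribe{Your Name}
\topic{Line Search Methods}

\begin{document}
\noindent\textbf{Optimization Reading Group} \hfill Scribe: \thescribe

\noindent\textbf{Topic:} \thetopic

\section{Notes}
% ...
\end{document}
```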
- Text Book and papers
- Numerical Optimization, by Nocedal and Wright.
Some chapters will be posted when they become available (CIMS password access controlled).
- Jonathan Richard Shewchuk, An Introduction to the Conjugate Gradient Method Without the Agonizing Pain
- Forsgren, Gill and Wright, Interior Point Methods for Nonlinear Optimization, SIAM Review
- Todd, Semidefinite Optimization, Acta Numerica
- Alizadeh, Haeberly, and Overton, Primal-Dual Interior-Point Methods for Semidefinite Programming: Convergence Rates, Stability and Numerical Results, SIAM J. Optim.
- Burke, Lewis, and Overton, A Robust Gradient Sampling Algorithm for Nonsmooth, Nonconvex Optimization, SIAM J. Optim.
(text below here is stolen shamelessly from Prof. Overton's nonlinear optimization course page)
For experimenting with our own optimization programs, Matlab is a good
choice: it is an excellent environment for small-scale numerical
computing, though its Optimization Toolbox is not very good.
See here for an excellent up-to-date guide to optimization software.
Matlab is a product of The MathWorks.
You can order your own copy of
Matlab for $99
or you can use Matlab on the Courant Sparcstation network (or dial in from home).
For Matlab documentation, type "helpdesk" at the Matlab prompt. To get started,
try A Free Matlab Online Tutorial, or look for others by a web
search. You may also want to look at a very outdated but still useful
Introductory Matlab Primer (3rd and last edition, postscript file).
There are many books on Matlab; I recommend
Matlab Guide, by
Higham and Higham, but you will find many other resources on the web,
including the latest information on Matlab 7.0.
As an NYU graduate student you have the opportunity to join
SIAM for free. SIAM is the main professional
organization for applied and computational math, and offers a number of
benefits to members. I've been a member since I was a graduate student,
and have benefitted in many ways from my association with SIAM.