DS-GA 3001.03: Optimization and Computational Linear Algebra for Data Science
(Fall 2016, NYU CDS)

Afonso S. Bandeira
bandeira [at] cims [dot] nyu [dot] edu

Lectures: Tue 5.10pm-7pm at 12 Waverly Place, room G08

Afonso's Office Hours: Wed 2.30pm-4.30pm at
CDS (60 5th Ave.), room 603 (or by appointment)
Afonso also holds Office Hours for another course on
Tue 10.30am-12.30pm (at CIWW 1123, 251 Mercer St.)

Section Leader:
Vladimir Kobzar.
Sections: Th 4.45pm-5.45pm at 12 Waverly Place, room L120
Vlad's Office Hours: Mon 11am-12.30pm
at CDS 609 or CDS 663 (or by appointment)
Vlad also holds Office Hours for another course on Mon
9.30am-11.00am in the same location

Piazza page for this course here.

  • Homework 10 and Extended Syllabus 10 are available
  • Homework 9 and Extended Syllabus 9 are available
  • Homework 8 and Extended Syllabus 8 are available
  • Homework 7 and Extended Syllabus 7 are available
  • Homework 6 and Extended Syllabus 6 are available
  • Homework 5 and Extended Syllabus 5 are available
  • Extended Syllabus for Lecture 4 is available
  • Extended Syllabus for Lecture 3 is available
  • Homework 3 is available
  • Extended Syllabus for Lecture 2 is available
  • Homework 2 is available
  • Please sign up for Piazza here; I will be sending announcements through it.
  • The videos of the lectures are available; log in to NYU Classes for more information.
  • Extended Syllabus for Lecture 1 is available
  • Homework 1 is available
  • I am teaching a more advanced (PhD level research-oriented) course in Mathematics of Data Science. If you feel you already have a good working knowledge of Optimization, Linear Algebra, and Probability, you may want to consider taking it.

Syllabus: This course will cover the basics of optimization and computational linear algebra used in Data Science. Contents: Vector spaces and linear transformations: rank, dimension, etc. Linear systems: conditioning, least squares. Singular value decomposition/principal component analysis, Rayleigh quotients. Applications: spectral clustering, dimension reduction, PageRank. Local and global optima. Constrained optimization. Optimality conditions and matrix calculus. Gradient descent and stochastic gradient descent. Newton's method and quasi-Newton methods (BFGS and L-BFGS). Linear optimization, duality, and convex optimization. Conjugate gradient. Some applications: Lasso, compressed sensing. Problems on graphs.
Important: This course description is preliminary and subject to change.
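As a small taste of one of the topics above, here is a minimal gradient descent sketch in Python (one of the languages accepted in this course). The quadratic objective and step size are illustrative choices, not course material:

```python
# Minimal gradient descent on f(x) = (x - 3)^2, whose minimizer is x = 3.
def gradient_descent(grad, x0, step=0.1, iters=100):
    x = x0
    for _ in range(iters):
        x = x - step * grad(x)  # take a step against the gradient
    return x

grad = lambda x: 2 * (x - 3)  # derivative of (x - 3)^2
x_min = gradient_descent(grad, x0=0.0)  # converges toward 3
```

For this objective, each step contracts the distance to the minimizer by a constant factor (here 0.8), so the iterates converge geometrically; the course covers when and how fast this happens in general.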

Programming: You are welcome to write (easily readable!) code in Julia, Python, or Matlab.
In class, I will try to use Julia as much as possible, and I recommend you give it a try; you can even run it in the browser without any setup!
I am here to help: If you have any questions, comments, or feedback, or want to brainstorm about a research idea, just email me and we'll schedule a time to meet.
Feedback: Also, if you have any comments or feedback on the class (it's going too fast or too slow, you want me to cover more of something or less of something else, etc.), please let me know (in person or by email) or submit a comment through this Google form. Direct feedback from you is the best way for me to give lectures that you like! (Keep in mind that the form is anonymous: I don't know who sent a comment and there is no way for me to reply, so for questions please use email instead.)

Books: All three books are optional and are on reserve in the Courant Library.
  • Strang: Introduction to Linear Algebra (there are very good lecture videos based on this book)
  • Nocedal & Wright: Numerical Optimization (should be available online via NYU here)
  • Boyd & Vandenberghe: Convex Optimization (available online here)
Grading and other important information:
  • Grading: 40% Homework, 20% Midterm (Oct 25th in class), 40% Final. Exams are open book/notes.

Homework (typically weekly, due on Tuesday before class at the CDS front desk):

Extended Syllabus: