Before getting into the review, I'd like to list my previous coursework for context: CSE 6040, ISYE 6501, MGT 6203, CSE 6240, ISYE 6669, and CSE/ISYE 6740 (A's in all). I also paired this class with Simulation, as I waived out of MGT 8803.
I loved this class. The topics are definitely more advanced than CDA, and for the most part it felt like a Machine Learning II course. There are proofs in every homework assignment (just one of the four questions) and on the exams, but the remaining three questions can easily require a derivation before you implement anything in code. The TAs were excellent, providing very quick and thorough responses. The students also felt extremely smart; most were in the OMSCS program with courses like AI, ML, and ML4T already under their belt, and no question on Piazza felt out of place. If I hadn't taken my heavy computational workload before this class, it would have been a struggle to excel. It starts with material that ESL relegates to an appendix, B-splines, which are referred to throughout the rest of the course. Then come basic computer vision, tensor math and decompositions, and optimization. The first optimization section felt like review for me since I had taken ISYE 6669, but the second was amazing: it covers state-of-the-art techniques used today for optimizing ML algorithms, like coordinate descent and ADMM. The course closes with sparse, high-dimensional datasets, relying heavily on the Lasso framework and applying it elsewhere, compressed sensing for example.
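To give a flavor of that second optimization section, here is a minimal sketch of cyclic coordinate descent for the Lasso, one of the techniques mentioned above. This is my own illustrative toy code (function names and problem setup are mine), not the course's example code:

```python
import numpy as np

def soft_threshold(z, t):
    """Soft-thresholding operator, the proximal map of the l1 penalty."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def lasso_cd(X, y, lam, n_sweeps=100):
    """Cyclic coordinate descent for min_b 0.5*||y - X b||^2 + lam*||b||_1.

    Each coordinate update is a one-dimensional Lasso problem solved
    exactly by soft-thresholding the partial correlation.
    """
    n, p = X.shape
    b = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0)      # precompute ||x_j||^2
    r = y - X @ b                      # running residual
    for _ in range(n_sweeps):
        for j in range(p):
            r += X[:, j] * b[j]        # remove coordinate j's contribution
            rho = X[:, j] @ r          # partial correlation with residual
            b[j] = soft_threshold(rho, lam) / col_sq[j]
            r -= X[:, j] * b[j]        # add the updated contribution back
    return b
```

With `lam = 0` this reduces to coordinate descent on ordinary least squares; as `lam` grows past `max|X.T @ y|`, every coefficient is thresholded to zero, which is the sparsity behavior the course leans on.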
I thought the teaching in the videos was amazing as well. The math is absolutely not hidden away from you, and the professor does a great job walking through exactly what is being shown; he never just reads off the slides. The number of videos every couple of weeks is lighter than in most courses, which is nice: most of the work is in doing the homework and the reading, either what he assigns or material from Stanford/CMU/Princeton (just like in ISYE 6740, I relied a ton on these universities' materials). The references were almost always papers, and that's when you know the class is legit.
Sometimes it feels great getting the problems correct. They seem very scary at first, especially once you get into things like matrix completion on a high-dimensional, sparse dataset, but you are armed with the tools to solve them. I always referred to this as the "beast class" out of respect for the material being covered, a first for me in the degree. I loved the applicability to real life: as a data scientist, you deal with sparsity ALL THE TIME. Concepts like tensor math and decomposition will serve me well in the deep learning class.
If you are not very adept at programming in a language of your choice (students can use MATLAB, R, or Python), don't have a solid background in linear algebra/multivariable calculus/stats, and haven't taken an ML class beforehand, this class might be a kick in the mouth. The optimization class definitely helped, but I felt the material was taught well enough that you could fill in the gaps if needed, and ML classes cover optimization already. You are assumed to understand concepts like SVD/PCA/k-means fully in depth: you are required to build, from scratch, both a tensor decomposition algorithm (basically a higher-order SVD, HOSVD for short, with an ALS set-up) and robust PCA to handle sparsity. Hence my claim that it is more of an ML II class. From previous reviews, and from previously given homework assignments, it feels like a conscious effort was made to make the assignments more rewarding and less reliant upon the example code; the example code felt like what it was designed to be, just an example to get your feet wet. I don't know for certain without a full comparison, however. As evidence that the material may be harder now, the average on assignments usually hovers around an 80; the highest I've seen for a homework is an 89, the lowest a 70. Definitely not a walk in the park for a generally smarter student population!
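For the curious, here is a minimal sketch of the truncated HOSVD idea behind that tensor assignment: take the leading left singular vectors of each mode unfolding as factor matrices, then project the tensor onto them to get the core. This is my own illustrative version under my own naming (the assignment itself layers an ALS-style refinement on top of something like this):

```python
import numpy as np

def unfold(T, mode):
    """Mode-n unfolding: move axis `mode` to the front, flatten the rest."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def mode_product(T, M, mode):
    """n-mode product: multiply tensor T by matrix M along `mode`."""
    out = np.tensordot(M, np.moveaxis(T, mode, 0), axes=1)
    return np.moveaxis(out, 0, mode)

def hosvd(T, ranks):
    """Truncated HOSVD: top-r left singular vectors of each unfolding
    give the factors; projecting T onto them gives the core."""
    factors = [np.linalg.svd(unfold(T, m), full_matrices=False)[0][:, :r]
               for m, r in enumerate(ranks)]
    core = T
    for m, U in enumerate(factors):
        core = mode_product(core, U.T, m)
    return core, factors

def reconstruct(core, factors):
    """Map the core back to the original tensor space."""
    X = core
    for m, U in enumerate(factors):
        X = mode_product(X, U, m)
    return X
```

A tensor whose multilinear rank matches `ranks` is reconstructed exactly; truncating further gives the low-rank compression the course uses on image and spatiotemporal data.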
I would highly, highly recommend this class for any student in the computational track of OMSA or the ML track of OMSCS who is more seasoned in the degree. It is a class of around 75-100 students with amazing material in terms of both teaching and assignments, especially if you like math.
I do not recommend taking it in the summer, as the optimization modules feel like a must. I just finished the final exam; both the midterm and the final felt very fair and homework-like. The only reason I am not giving a very hard rating is that some problems were more in the CDA style of running a model in a package of your choice, but those were few and far between, and it still took a ton of time just to get to that point.
The example code given is amazing - I'm sure I'll rely upon it throughout my career.