- Quanquan Gu
- COM SCI 260
Based on 4 Users
Grade distributions are collected using data from the UCLA Registrar’s Office.
Lectures: He is not the best lecturer. For the more mathematical/theoretical content, I learned more from the book. For the content on neural networks, he didn’t even seem familiar with the slides. But the networks content only appears on quizzes, so I guess it’s ok.
Homework 35%: The questions are very mathematical until the switch to neural networks, at which point the homework becomes implementing networks from scratch in Python. The math homeworks were also a grind because they were required to be in LaTeX.
Midterm 30%: They generously gave a very similar practice midterm, so as long as you can confidently solve those problems, or at least note how to solve them on your cheatsheet, you’ll score well. It also covers the math content more.
No final exam; instead there are 6 quizzes, worth only 5% of the grade in total.
Final project 30%: You can choose from proposals that the professor solicited from people he knows, or pursue your own project if you’re up for it. I would recommend taking the course with friends/colleagues you trust, so that you can collaborate without issues.
Overall, I would recommend this class if you are mathematically mature, have an interest in the math in ML theory, or are just very ML-invested. Or have an interest in self-learning all the math and network concepts. I unfortunately have little qualification in any of the above, so I suffered silently. Sunk cost fallacy is real. Shoutout to Lucas Tecot for the HW hints.
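To give a flavor of the "implementing networks from scratch" homeworks mentioned above, here is a hypothetical minimal sketch in pure Python: a single logistic neuron trained with stochastic gradient descent on a toy AND dataset. The actual assignments were surely larger (full networks with hidden layers and backpropagation), but the core loop of forward pass, loss gradient, and parameter update is the same; the learning rate and epoch count here are arbitrary illustrative choices.

```python
import math
import random

def sigmoid(z):
    """Logistic activation."""
    return 1.0 / (1.0 + math.exp(-z))

# Toy data: logical AND on two binary inputs.
X = [(0, 0), (0, 1), (1, 0), (1, 1)]
y = [0, 0, 0, 1]

random.seed(0)
w = [random.uniform(-0.5, 0.5) for _ in range(2)]  # weights
b = 0.0                                            # bias
lr = 0.5  # learning rate (assumed value, for illustration)

for epoch in range(2000):
    for (x1, x2), target in zip(X, y):
        p = sigmoid(w[0] * x1 + w[1] * x2 + b)  # forward pass
        g = p - target        # gradient of cross-entropy w.r.t. the logit
        w[0] -= lr * g * x1   # SGD update for each parameter
        w[1] -= lr * g * x2
        b -= lr * g

preds = [round(sigmoid(w[0] * x1 + w[1] * x2 + b)) for x1, x2 in X]
print(preds)
```

AND is linearly separable, so a single neuron suffices here; the homework problems presumably required stacking such units into layers and deriving the backward pass by hand.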
This course is definitely among the first courses you would like to take if you major in machine learning. The first half of the course is about the most important theoretical aspects of machine learning, most importantly the approximation-generalization tradeoff. The second half is about typical problems of machine learning like regression, classification, ranking, etc.
The course has fantastic slides. They are clear and are pretty much all you need for the final exam. Prof. Gu is really good at handling questions, so you needn’t worry even though the course is math-heavy, because Prof. Gu will walk you through every equation if you have questions.
The homeworks are challenging and took me a lot of time, but they help a lot with exam preparation. There are quizzes, which are quite easy (mainly about basic concepts). The group project looks scary at first, but you are free to choose from a wide range of topics. The final exam (a take-home exam) is pretty similar to the homeworks.
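For readers unfamiliar with the approximation-generalization tradeoff that the reviews highlight as the centerpiece of the first half, it is usually stated as a decomposition of the excess risk of a learned hypothesis \(\hat{h}\) over a class \(\mathcal{H}\) (this is the standard textbook notation, not necessarily the course's own):

```latex
R(\hat{h}) - R^{*}
= \underbrace{\Bigl( R(\hat{h}) - \min_{h \in \mathcal{H}} R(h) \Bigr)}_{\text{estimation (generalization) error}}
+ \underbrace{\Bigl( \min_{h \in \mathcal{H}} R(h) - R^{*} \Bigr)}_{\text{approximation error}}
```

A richer class \(\mathcal{H}\) shrinks the approximation error but tends to inflate the estimation error for a fixed sample size; bounding the estimation term is what most of the theory (VC dimension, Rademacher complexity, etc.) is about.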
Very interesting course structure, and it will be especially helpful to students whose research involves machine learning. However, the course is way too theoretical and math-heavy, and the professor makes this very clear in the first lecture. Sometimes it got difficult to follow the lectures, but I guess that is mainly because of the online delivery of instruction. Thankfully, the course textbook is very good and you can study from the book if you missed the lectures. Just one warning - the workload is just way too much. All assignments are to be submitted in LaTeX, which takes a lot of your effort. It is almost as good as studying two courses. By the time the quarter ended, I was exhausted with the subject. The only respite is that grading is very relaxed and it is easy to score an A+ or A. The professor even drops the worst score from your homework and quizzes and does not include it in the final grade. The TAs were very helpful and, in general, the discussion sessions were very informative and helpful for the homeworks. Overall, I did learn a lot from this course, but God, I wish the homeworks didn’t have to be done in LaTeX.
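For anyone dreading the LaTeX requirement several reviewers mention, a bare-bones homework skeleton is quick to set up. The title, packages, and structure below are illustrative guesses, not an official course template:

```latex
\documentclass[11pt]{article}
\usepackage{amsmath, amssymb, amsthm} % standard math typesetting packages

\title{CS 260 Homework 1} % illustrative title, not an official template
\author{Your Name}

\begin{document}
\maketitle

\section*{Problem 1}
\textbf{Claim.} State the result to be shown.

\begin{proof}
Write the argument here; align multi-step derivations with
\texttt{align*} from \texttt{amsmath}.
\end{proof}

\end{document}
```

Reusing one such skeleton across assignments removes most of the per-homework LaTeX overhead.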
The professor is truly knowledgeable about the theory of machine learning. The first part of the class, covering theory and proofs, is interesting: he successfully made the rigorous mathematical proofs easy to follow and deepened our understanding of machine learning. The second part is more practical compared to the first half of the class.