Jonathan Kao
Based on 96 Users
Be prepared to spend 20+ hours a week on the homework assignments. I learned a ton from this course. It makes it so that AI/ML is no longer a black box. You can understand how things are working and how it all comes back to the math.
The lectures are very good. The professor and TAs are very helpful. It is a great course, which I would recommend if you are single and have the time.
I would highly recommend this class to anyone interested in deep learning and machine learning. Professor Kao is a very good lecturer and he does an amazing job explaining concepts. I never truly understood how backpropagation worked until he explained it in class. Anyone interested in research/ML should definitely take this class. You will learn so much.
However, the class is not a cake walk. It's actually fairly easy to get a good grade in this class as long as you put in the effort. There is only one exam around week 8, which won't be bad if you pay attention to lecture (our average for the exam was a 95%). The homeworks are the real killer and can take a very long time. You essentially have to build neural networks from scratch using Python and Numpy.
Overall, this is an amazing class where you can truly learn so much, but at the price of many hours of homework. Professor Kao is probably one of my favorite professors I have ever had at UCLA.
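The reviews above mention building neural networks from scratch in Python and NumPy. For readers wondering what that looks like in practice, here is a minimal, hypothetical sketch of the kind of layer such homework might have you implement; the function names and shapes are illustrative assumptions, not the actual assignment API:

```python
import numpy as np

# Hypothetical illustration of "from scratch" homework code: a single affine
# (fully connected) layer with a forward pass and the backward pass derived
# via the chain rule. Not the actual C147 assignment interface.

def affine_forward(x, w, b):
    """Compute out = x @ w + b and cache the inputs for backprop."""
    out = x @ w + b
    cache = (x, w)
    return out, cache

def affine_backward(dout, cache):
    """Given the upstream gradient dout, return gradients w.r.t. x, w, b."""
    x, w = cache
    dx = dout @ w.T          # gradient flowing back to the layer input
    dw = x.T @ dout          # gradient w.r.t. the weights
    db = dout.sum(axis=0)    # gradient w.r.t. the bias
    return dx, dw, db

# Quick shape check
x = np.random.randn(4, 3)    # batch of 4 inputs, 3 features
w = np.random.randn(3, 5)    # 3 -> 5 weight matrix
b = np.zeros(5)
out, cache = affine_forward(x, w, b)
dx, dw, db = affine_backward(np.ones_like(out), cache)
assert dx.shape == x.shape and dw.shape == w.shape and db.shape == b.shape
```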
I think the winter 2024 offering of C147 was very similar to the reviews I've read in the past:
- There were some very hard/long assignments. However, if you used Piazza, you should be fine
- The class was a bit more math-heavy in the beginning than at the end
- The professor was excellent!
There are also several comments complaining about the "slow pace" of the class, "over-enrollment", and the new 50% midterm weight. I'd like to address those and share my views as a fellow student:
Personally, as someone who skipped M146 and went straight into C147, I appreciated that the professor reviewed some of the more fundamental aspects of DL at the beginning of the course. Furthermore, I'd like to point out that this is, after all, a graduate class. Grad students often haven't taken UCLA's particular sequence of courses prior to coming here, so it makes sense for the professor to do some review at the very beginning to make sure everyone is on the same page.
For those who were interested in "practical" machine learning (i.e., learning one of the popular ML libraries), the final project was an opportunity to do that. Personally, I think this class is not a PyTorch bootcamp, and rightfully so: C147 focuses on more fundamental knowledge and the theory side of DL. There are also other, more coding-heavy classes (CS188 in W24, for example) that one can take to gain that kind of experience.
While it is true this was a very big class, given the choice between taking it and not taking it, I'd always choose the former. I definitely appreciated the effort the teaching staff put into making the logistics work for such a large class!
Finally, the median grade for the class was an A- (partly because it's a graduate class, and also because Kao doesn't grade harshly). I really don't understand why people complain about grades at this point, considering this is an upper-div and the midterm median was a B.
I don't have anything to say that others haven't already said: Professor Kao is truly one of the best lecturers at UCLA, and I would highly recommend this class if you are interested in Neural Networks and Deep Learning. Also, the TAs for this class were amazing, especially Tonmoy Monsoor. Tonmoy is insanely knowledgeable about the topic, and his discussions were super useful for the homeworks!
Grading:
- Homework: 40% (5 homeworks)
- Midterm: 30%
- Final Project: 30%
- Extra Credit: 0.5% for filling out the class eval, up to 1.5% for participating on Piazza (in a useful way), and some extra credit given on the midterm (the final question on the exam is optional extra credit)
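To make the weights above concrete, here is a small sketch of how they would combine into an overall score. The example numbers and the way extra credit is simply added on top are assumptions for illustration, not official course policy:

```python
# Illustrative weighted-grade calculation using the weights listed above
# (HW 40%, Midterm 30%, Final Project 30%). The extra-credit handling and
# the example scores below are assumptions, not stated course policy.

def course_grade(hw_avg, midterm, project, extra_credit=0.0):
    """All inputs are percentages (0-100); extra_credit is added on top."""
    weighted = 0.40 * hw_avg + 0.30 * midterm + 0.30 * project
    return weighted + extra_credit

# Example: 95% HW average, 90% midterm, 98% project, 2% extra credit
print(course_grade(95, 90, 98, extra_credit=2.0))  # about 96.4
```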
Jonathan Kao is my favorite professor at UCLA. Highly recommend this organized, fair, and passionate lecturer.
The class is good overall, and Kao is indeed a good professor. Kao's lectures are very clear and well-paced, and he often has time to answer any questions people may have. At the beginning of the class, Kao warned us about the workload, but I found it pretty manageable. The concepts of this class are very well explained, and Kao even manages to go over some real-life applications (including his own research) to make the class less "theoretical" than it is. I have been to Kao's office hours only once or twice, and he mainly just went over the HW problems. The Piazza for this class is pretty well organized, and the TAs are very responsive to questions.
However, I would only give this class a 4 out of 5 overall due to two things: the grading scheme and the exams. Let's first talk about grading. Kao uses an absolute grading scale, which means that he doesn't curve unless the class performs really badly overall. This is great in theory because it means that no one is competing against each other. However, in reality, they definitely cannot run a class where everyone gets an easy A, so they inevitably have to control the class grade distribution through the exams. But the problem is that the TAs kind of just suck at designing exams. The midterm, according to Kao and the TAs, was designed to be fairly easy, but the median score was lower than what they expected. Honestly, I did feel like the midterm was easier than I expected, but a lot of questions were so easy that they essentially became trivial. For example, for some questions you could just tell the answer by inspection, so you wouldn't solve them the way they wanted you to. Anyway, I got 5 points back after fighting for regrades. Because of how badly people performed on the midterm, Kao promised that he would replace the midterm grade with the final grade if people scored higher on the final. So you would think that the final might be roughly the same difficulty as the midterm, if not easier.
The final was extremely difficult and surprising in terms of content. It was so bad that people were still talking about it on Piazza after the exam. Content-wise, "modulation", a concept that showed up on every single final from past years, was not covered. Instead, we got multiple questions about a really niche Fourier transform concept that showed up only once on the HW. Also, the TAs were obsessed with putting the exact same proofs that we went over in class on the exams (both midterm and final). If you went to the review sessions, the TAs would "hint" that there were some proofs from lecture that you needed to know for the exam, and those were the same ones that actually showed up on the exam. I don't know how I feel about this, to be honest. Maybe the TAs were doing this to make certain parts of the exam easier, but I feel like it was really unfair because people who had the proofs on their cheat sheets obviously had an unfair advantage over those who did not. And you didn't even have to actually understand the proofs; you just had to be lucky enough, or attentive enough to their hints, to copy them down. In the end, they gave out a very generous curve and basically raised everyone's grade by 3%.
Also, your HW is worth more than any other single component (40% of your grade), so whether or not you get 100% on it affects your grade significantly. For instance, even though I got a 96/100 on the midterm, because my HW average was only ~95%, I still had to get close to 90% on the final to get an A in this class, which was pretty ridiculous. Also, something that nobody tells you is that you can go to the TAs' office hours and literally ask them to check your answers against their HW solution manual. Again, I don't think this is fair to people who cannot make it to office hours or simply don't know about this hidden secret. So if you are taking this class, make sure to take advantage of it.
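As a rough sanity check of the arithmetic in this review: assuming a 40/30/30 split between homework, midterm, and final for that quarter and a 93% cutoff for an A (both assumptions, since the exact numbers aren't stated), the required final score works out to the high 80s, consistent with the reviewer's "close to 90%":

```python
# Back-of-the-envelope check of the reviewer's claim, under ASSUMED weights
# (HW 40%, Midterm 30%, Final 30%) and an ASSUMED 93% cutoff for an A.
hw_avg, midterm, a_cutoff = 95.0, 96.0, 93.0

# Solve 0.40*hw + 0.30*midterm + 0.30*final >= a_cutoff for the final score.
final_needed = (a_cutoff - 0.40 * hw_avg - 0.30 * midterm) / 0.30
print(f"Final score needed for an A: {final_needed:.1f}%")  # roughly 87.3%
```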
The midterm was a copy of the review, and because the median was so high, the TAs made the final extremely difficult. The class overall was a lot of work, but Prof. Kao explained the material very well and in a simple manner. Homework was pretty difficult too, and it took a long time to finish.
The first part of this review is for people who are considering taking this class without prior experience. I highly recommend not taking this class unless you took M146 and have some experience in machine learning, neither of which I did (this is my fault). The course was very math-heavy at the start, and Kao doesn't define many of the ML terms that he already expects you to know.
Generally, the workload is also very intense and very much requires that you have an understanding of numpy (it will be extremely painful if you do not). Luckily Tonmoy was very helpful in his office hours for the homeworks, but the homeworks will generally be awful.
The class is very theory-based. You learn a lot about how neural networks function and what each hyperparameter does, but you won't be taught much about how to use frameworks such as PyTorch or good practices for training a model.
I also personally would have changed the grading scheme a bit. 50% for the midterm is a bit excessive. There was also double jeopardy on the backprop question on the midterm: if you made the same small mistake on both sides of the backprop, you would lose the points for both sides; you could lose 4% of your total grade in the class just for making a small algebra mistake. The 2% extra credit is nice though, and the project is graded very leniently.
I don't know what more I can say that other reviews don't state already. Professor Kao is one of the most helpful and kindhearted professors I have had the utmost pleasure to learn from. His use of slides, recorded lectures, and Zoom livestreams has helped me keep up with the class without having to worry about missing one or two lectures. The TAs are the best TAs I've ever encountered in my time at UCLA. Yang, Tonmoy, Kaifeng, Shreyas, and Lahari were very helpful and straightforward about whether you got a homework question right or wrong (they don't dance around you and say 'hmm, you might be right' or give you some BS answer); they help you get to the right answer if you're stuck and confirm it when you are correct. The midterm was hard, but expected. The questions mirrored the midterm review closely, as the TAs had emphasized, and the TAs are straightforward with you if you ask what's on the midterm. I asked one of them, 'is expectation going to be on the midterm,' to which they simply replied, 'yes.' Office hours were an absolute godsend. Go to them if you are not comfortable with the subject. I had satisfied absolutely NONE of the prereqs, so I went to OH to get the help I needed, and it WAS helpful.
I won't sugarcoat it; this class is A LOT of work. It's fairly easy to get an A, but be ready to put in the time and effort to achieve that grade. I dedicated around 10-15 hours every week to this class (I took CM146, CS143, and DH101 as well, for reference). It was highly rewarding and I learned SO much. AI was such a black box before I took this class; there was so much hype and pizzazz surrounding it. But C147 really broke it down into the basic parts that go into building a neural network, and though I no longer look at AI mystically, I enjoy learning about it nonetheless. So, for anyone who is interested in this subject or is looking for a CS elective, take C147.
Kao teaches this well. I didn't have any of the prereqs and did fine. Just start assignments early and go to discussions. Many people did not study very much for the exam this year, which is why the median was lower than in previous years. In my opinion, we had by far the easiest exam compared to previous years (though the extra credit was very difficult). The only prereqs you really need are multivariable calculus, knowing what expectation is, and, most importantly, Python and NumPy skills. The rest will come. I wish we covered more material; a lot of students asked really bad questions during class, which kept us behind. Still recommend it if you are very interested in deep learning. If you aren't very interested, you may not like the class.