COM SCI 162

Natural Language Processing

Description: Lecture, four hours; discussion, two hours; outside study, six hours. Requisite: course 145 or M146. Recommended requisite: course 35L. Introduction to wide range of natural language processing tasks, algorithms for effectively solving these problems, and methods of evaluating their performance. Focus on statistical and neural-network learning algorithms that train on text corpora to automatically acquire knowledge needed to perform tasks. Discussion of general issues and presentation of abstract algorithms. Assignments on theoretical foundations of linguistic phenomena and implementation of algorithms. Implemented versions of some algorithms are provided to give feel for how discussed systems really work, and to allow for extensions and experimentation as part of course projects. Letter grading.

Units: 4.0
Overall Rating 2.3
Easiness 2.3 / 5
Clarity 2.5 / 5
Workload 3.0 / 5
Helpfulness 2.8 / 5
Most Helpful Review
Winter 2024 - An unfortunate class. The professor seems like a solid human being, very much cares about her teaching and impact, and I hope the course improves accordingly. That said, this was a very poor machine learning course. I would recommend avoiding this class until reviews suggest it has improved; presently, you are much better off taking an NLP class from one of the variety of online offerings.

In general, I have found there to be a pretty common spectrum of experiences in ML courses. I have seen this in online courses, in undergrad courses at UCLA, and similarly in grad courses I've audited. On one end, you will encounter classes and professors who are extremely math heavy, with heavy use of notation that is often assumed to be clear. They are rigorous, but in their best form you will leave with a deep understanding and trust that you know where your assumptions are. On the other end, you have courses that aim to offer an extremely intuitive explanation. These are difficult to do well and can still provide varying amounts of rigor. I feel like Andrew Ng is a very good example here. Kao (at UCLA) is also very strong.

This course in its current offering held a middling position: it didn't provide clear explanations that help you develop intuition about NLP, nor did it provide any amount of rigor in its mathematical explanations. The slides are littered with inconsistencies in notation, the mathematical explanations are poor and don't seem well thought out, and there are so many errors on homeworks and exams that you are left not really knowing which assumptions you were supposed to make and which you shouldn't. Violet should consider rethinking her slides, and should reconsider the overall message she is trying to convey at both the lecture and the slide level. I hope it improves, because she has the passion for a good offering.