Violet Peng
COM SCI 162
Based on 13 Users
Great introduction to NLP and its modern-day applications! I learned so much from this course alone. I liked the way the professor explained the material; it was super engaging and super relevant to how NLP is actually used in the real world. It has even made me consider a career in the NLP/ML field.
The core topic of this course, IMO, is language models. You start from the very basics (n-grams), and the professor explains step by step how language models have evolved over the past few years, culminating in the various transformer models (BERT, GPT). To get a feel for that n-gram starting point, see the sketch below.
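Here is a minimal bigram language model sketch in Python, just to make the "very basics" concrete. The toy corpus, function name, and sentence markers are my own illustration, not material from the course:

```python
from collections import Counter, defaultdict

# Toy bigram language model: count word pairs, then estimate
# P(next | previous) from relative frequencies.
corpus = [
    "language models predict the next word",
    "language models learn from text",
]

counts = defaultdict(Counter)
for sentence in corpus:
    tokens = ["<s>"] + sentence.split() + ["</s>"]
    for prev, nxt in zip(tokens, tokens[1:]):
        counts[prev][nxt] += 1

def bigram_prob(prev, nxt):
    # Maximum-likelihood estimate: count(prev, nxt) / count(prev)
    total = sum(counts[prev].values())
    return counts[prev][nxt] / total if total else 0.0

print(bigram_prob("language", "models"))  # 1.0 in this toy corpus
print(bigram_prob("models", "predict"))   # 0.5
```

The course builds from counts like these toward neural and transformer-based models, which replace the count table with learned parameters.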
Grading was very fair and lenient. There were plenty of extra credit opportunities on the projects, and she gave everyone a 5% boost on their midterm and final grades for completing a course survey. The exams themselves were very fair; as long as you understand the lectures, you should be all set.
I also want to mention the course project, since it gives you actual hands-on experience with how researchers use NLP in the real world. It is a fairly involved group project, but you learn a ton, and it could pay off in the future if you choose to pursue something NLP related. That being said, the project takes a LOT longer than one might expect: training the transformer models on the GPUs took forever, and the VM environment in which we trained the models kept crashing. A rough idea of what that kind of setup looks like is sketched below.
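For readers unfamiliar with this kind of project, here is a hedged sketch of what loading a pretrained transformer for fine-tuning typically looks like with the Hugging Face transformers library. The model name, task, and labels are illustrative assumptions, not the actual course assignment:

```python
# Illustrative only: load a pretrained BERT with a fresh
# classification head, then run a single forward pass.
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2
)

inputs = tokenizer(
    "NLP projects take longer than you expect.",
    return_tensors="pt",
)
outputs = model(**inputs)
print(outputs.logits.shape)  # torch.Size([1, 2])
```

Fine-tuning a model like this on a real dataset is where the GPU hours (and the VM crashes) come in.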