
Learner Reviews & Feedback for Natural Language Processing with Attention Models by DeepLearning.AI

819 ratings

About the Course

In Course 4 of the Natural Language Processing Specialization, you will:

a) Translate complete English sentences into German using an encoder-decoder attention model
b) Build a Transformer model to summarize text
c) Use T5 and BERT models to perform question-answering
d) Build a chatbot using a Reformer model

By the end of this Specialization, you will have designed NLP applications that perform question-answering and sentiment analysis, created tools to translate languages and summarize text, and even built a chatbot!

Learners should have a working knowledge of machine learning and intermediate Python, including experience with a deep learning framework (e.g., TensorFlow, Keras), as well as proficiency in calculus, linear algebra, and statistics. Please make sure that you've completed Course 3, Natural Language Processing with Sequence Models, before starting this course.

This Specialization is designed and taught by two experts in NLP, machine learning, and deep learning. Younes Bensouda Mourri is an Instructor of AI at Stanford University who also helped build the Deep Learning Specialization. Łukasz Kaiser is a Staff Research Scientist at Google Brain and a co-author of TensorFlow, the Tensor2Tensor and Trax libraries, and the Transformer paper.

Top reviews


Oct 4, 2020

Could the instructors make a video explaining the ungraded lab? That would be useful. Other students also find the LSH attention layer ungraded labs difficult to understand. Thanks


Nov 20, 2020

The course is a very comprehensive one and covers all the state-of-the-art techniques used in NLP. It's quite an advanced-level course, and good Python coding skills are a must.


101 - 125 of 198 Reviews for Natural Language Processing with Attention Models

By Utku T · Feb 24, 2021
By Arunjith · Sep 15, 2021
By Ezio X · Nov 1, 2020
By Ivan V S · Feb 16, 2022
By Adarsh W · Jun 3, 2021
By Mario A C S · Oct 30, 2020
By Bhavana s · Nov 29, 2021
By Chunlei Z · Apr 5, 2021
By Zane L · Oct 13, 2020
By Nguyen V T · Jun 13, 2022
By Luis A · Sep 26, 2020
By Roman L · Sep 28, 2020
By Pratyusha P · Jul 17, 2022
By T d · Jan 13, 2021
By Meet G · Dec 28, 2020
By Коняев М Н · Mar 29, 2021
By Xiaoli C · Oct 14, 2020
By Wei X · Dec 30, 2020
Dec 18, 2020
By Nikesh B · May 15, 2021
By Arman I · May 22, 2021
Oct 21, 2020
By Han T · Oct 20, 2020
By Onuigwe V · Oct 25, 2020
By Pranay R · Nov 13, 2020