Natural Language Processing with PyTorch
This course covers the use of advanced neural network constructs and architectures, such as recurrent neural networks, word embeddings, and bidirectional RNNs, to solve complex word and language modeling problems using PyTorch.
What you'll learn
From chatbots to machine-generated literature, some of the hottest applications of ML and AI these days are for data in textual form. In this course, Natural Language Processing with PyTorch, you will gain the ability to design and implement complex text processing models using PyTorch, which is fast emerging as a popular choice for building deep-learning models owing to its flexibility, ease of use, and built-in support for optimized hardware such as GPUs. First, you will learn how to leverage recurrent neural networks (RNNs) to capture sequential relationships within text data. Next, you will discover how to express text using word vector embeddings, a sophisticated form of encoding supported out of the box in PyTorch via the torchtext utility. Finally, you will explore how to build multilayer and bidirectional RNNs to capture both backward and forward relationships within data. You will round out the course by building sequence-to-sequence RNNs for language translation. When you are finished with this course, you will have the skills and knowledge to design and implement complex natural language processing models using sophisticated recurrent neural networks in PyTorch.
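The first module works toward a binary text classifier that feeds embedded tokens through a recurrent layer. A minimal sketch of that shape in PyTorch follows; the class name, vocabulary size, and dimensions are illustrative assumptions, not the course's exact code:

```python
import torch
import torch.nn as nn

# Minimal sketch: token indices -> embeddings -> LSTM -> a single logit.
# All sizes here are assumed for illustration.
class TextClassifier(nn.Module):
    def __init__(self, vocab_size=10_000, embed_dim=100, hidden_dim=256):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim)
        self.rnn = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.fc = nn.Linear(hidden_dim, 1)  # one logit for binary labels

    def forward(self, token_ids):
        # token_ids: (batch, seq_len) integer tensor
        embedded = self.embedding(token_ids)       # (batch, seq_len, embed_dim)
        _, (hidden, _) = self.rnn(embedded)        # hidden: (1, batch, hidden_dim)
        return self.fc(hidden[-1])                 # (batch, 1) logits

model = TextClassifier()
dummy_batch = torch.randint(0, 10_000, (4, 20))    # 4 sequences of 20 tokens
print(model(dummy_batch).shape)                    # torch.Size([4, 1])
```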
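For the word-embedding material, the course generates analogies from pretrained GloVe vectors. Here is a rough sketch using the classic torchtext.vocab.GloVe interface; the torchtext API has changed across releases, the `analogy` helper is a hypothetical illustration, and the first call triggers a large download:

```python
import torch
from torchtext.vocab import GloVe

# Pretrained 100-dimensional GloVe vectors (downloaded on first use).
glove = GloVe(name="6B", dim=100)

def analogy(a, b, c, topk=5):
    """Solve a : b :: c : ? by vector arithmetic over GloVe embeddings."""
    query = glove[b] - glove[a] + glove[c]
    # Cosine similarity of the query against every vector in the vocabulary.
    sims = torch.cosine_similarity(query.unsqueeze(0), glove.vectors)
    best = sims.topk(topk + 3).indices
    words = [glove.itos[int(i)] for i in best]
    return [w for w in words if w not in (a, b, c)][:topk]

print(analogy("man", "king", "woman"))  # expect "queen" near the top
```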
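Multilayer and bidirectional behavior are both constructor flags on PyTorch's recurrent layers. A short sketch, with assumed dimensions, showing how the output and hidden-state shapes change:

```python
import torch
import torch.nn as nn

# "Multilayer" and "bidirectional" are constructor flags; sizes are assumed.
rnn = nn.LSTM(
    input_size=100,      # embedding dimension
    hidden_size=256,
    num_layers=2,        # stacked (multilayer) RNN
    bidirectional=True,  # a forward and a backward pass over each sequence
    batch_first=True,
)

x = torch.randn(8, 20, 100)             # (batch, seq_len, embed_dim)
output, (hidden, cell) = rnn(x)

# output concatenates forward and backward states at every timestep.
print(output.shape)   # torch.Size([8, 20, 512]) -- 2 directions * 256
# hidden holds the final state for each layer and direction.
print(hidden.shape)   # torch.Size([4, 8, 256]) -- 2 layers * 2 directions

# For classification, the top layer's final forward and backward hidden
# states are typically concatenated:
top = torch.cat([hidden[-2], hidden[-1]], dim=1)   # (batch, 512)
```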
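The final module's encoder-decoder translator trains with teacher forcing: at each decoding step, the ground-truth target token (rather than the model's own prediction) is fed as the next decoder input with some probability. A compressed sketch under assumed vocabulary sizes and dimensions; class and function names are illustrative:

```python
import random
import torch
import torch.nn as nn

# Illustrative constants: start-of-sentence index, hidden size, vocab sizes.
SOS, HIDDEN, SRC_VOCAB, TGT_VOCAB = 0, 256, 4000, 4000

class Encoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(SRC_VOCAB, HIDDEN)
        self.gru = nn.GRU(HIDDEN, HIDDEN, batch_first=True)

    def forward(self, src):                    # src: (batch, src_len)
        _, hidden = self.gru(self.embed(src))  # hidden: (1, batch, HIDDEN)
        return hidden

class Decoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(TGT_VOCAB, HIDDEN)
        self.gru = nn.GRU(HIDDEN, HIDDEN, batch_first=True)
        self.out = nn.Linear(HIDDEN, TGT_VOCAB)

    def forward(self, token, hidden):          # token: (batch, 1)
        output, hidden = self.gru(self.embed(token), hidden)
        return self.out(output.squeeze(1)), hidden  # logits: (batch, TGT_VOCAB)

def decode(encoder, decoder, src, tgt, teacher_forcing_ratio=0.5):
    """One forward pass over a target sentence, using teacher forcing."""
    hidden = encoder(src)
    token = torch.full((src.size(0), 1), SOS, dtype=torch.long)
    logits = []
    for t in range(tgt.size(1)):
        step_logits, hidden = decoder(token, hidden)
        logits.append(step_logits)
        if random.random() < teacher_forcing_ratio:
            token = tgt[:, t:t + 1]                      # feed the true token
        else:
            token = step_logits.argmax(1, keepdim=True)  # feed the prediction
    return torch.stack(logits, dim=1)  # (batch, tgt_len, TGT_VOCAB)

src = torch.randint(0, SRC_VOCAB, (2, 7))
tgt = torch.randint(0, TGT_VOCAB, (2, 9))
print(decode(Encoder(), Decoder(), src, tgt).shape)  # torch.Size([2, 9, 4000])
```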
Table of contents
- Module Overview 1m
- Word Embeddings to Represent Text Data 5m
- Introducing torchtext to Process Text Data 3m
- Feeding Text Data into RNNs 3m
- Setup and Data Cleaning 4m
- Using torchtext to Process Text Data 8m
- Designing an RNN for Binary Text Classification 5m
- Training the RNN 5m
- Using LSTM Cells and Dropout 2m
- Module Summary 1m
- Module Overview 1m
- Numeric Representations of Words 3m
- Word Embeddings Capture Context and Meaning 4m
- Generating Analogies Using GloVe Embeddings 8m
- Multilayer RNNs 2m
- Bidirectional RNNs 4m
- Data Cleaning and Preparation 8m
- Designing a Multilayer Bidirectional RNN 5m
- Performing Sentiment Analysis Using an RNN 3m
- Module Summary 1m
- Module Overview 1m
- Using Sequences and Vectors with RNNs 4m
- Language Translation Using Encoders and Decoders 2m
- Representing Input and Target Sentences 2m
- Teacher Forcing 3m
- Setting Up Helper Functions for Language Translation 3m
- Preparing Sentence Pairs 5m
- Designing the Encoder and Decoder 5m
- Training the Sequence-to-Sequence Model Using Teacher Forcing 9m
- Translating Sentences 4m
- Summary and Further Study 2m