Building Regression Models with scikit-learn
This course covers essential regression techniques, starting with ordinary least squares, moving on to Lasso, Ridge, and Elastic Net, and finishing with advanced techniques such as Support Vector Regression and Stochastic Gradient Descent Regression.
What you'll learn
Regression is one of the most widely used modeling techniques, popular with everyone from business professionals to data scientists. With scikit-learn, you can implement virtually every important type of regression with ease.
In this course, Building Regression Models with scikit-learn, you will gain the ability to enumerate the different types of regression algorithms and correctly implement them in scikit-learn.
First, you will learn what regression seeks to achieve and how the ubiquitous Ordinary Least Squares algorithm works under the hood. Next, you will discover how to implement techniques that mitigate overfitting, such as Lasso, Ridge, and Elastic Net regression. You will then explore more advanced forms of regression, including those using Support Vector Machines, Decision Trees, and Stochastic Gradient Descent. Finally, you will round out the course by understanding the hyperparameters these regression models possess and how they can be optimized. When you are finished with this course, you will have the skills and knowledge to select the regression algorithm best suited to the problem you are trying to solve, and to implement it correctly using scikit-learn.
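To give a flavor of what the course builds toward, here is a minimal sketch (not course material) comparing Ordinary Least Squares with the regularized variants mentioned above, fit on synthetic data; the dataset and hyperparameter values are illustrative assumptions, not the course's own examples.

```python
from sklearn.datasets import make_regression
from sklearn.linear_model import LinearRegression, Lasso, Ridge, ElasticNet
from sklearn.model_selection import train_test_split

# Synthetic regression data stands in for a real dataset like Automobile MPG.
X, y = make_regression(n_samples=200, n_features=10, noise=10.0, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

models = {
    "OLS": LinearRegression(),
    "Lasso": Lasso(alpha=1.0),          # L1 penalty: drives some coefficients to zero
    "Ridge": Ridge(alpha=1.0),          # L2 penalty: shrinks coefficients smoothly
    "ElasticNet": ElasticNet(alpha=1.0, l1_ratio=0.5),  # blend of L1 and L2
}
for name, model in models.items():
    model.fit(X_train, y_train)
    print(f"{name}: R^2 = {model.score(X_test, y_test):.3f}")
```

Each model exposes the same `fit`/`score` interface, which is what makes swapping regression algorithms in scikit-learn so straightforward.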
Table of contents
- Version Check 0m
- Module Overview 1m
- Prerequisites and Course Outline 1m
- Connecting the Dots with Linear Regression 7m
- Minimizing Least Square Error 4m
- Installing and Setting up scikit-learn 3m
- Exploring the Automobile Mpg Dataset 7m
- Visualizing Relationships and Correlations in Features 6m
- Mitigating Risks in Simple and Multiple Regression 6m
- R-squared and Adjusted R-squared 2m
- Regression with Categorical Variables 4m
- Module Summary 1m
- Module Overview 1m
- Overview of Regression Models in scikit-learn 2m
- Overfitting and Regularization 4m
- Lasso, Ridge and Elastic Net Regression 5m
- Defining Helper Functions to Build and Train Models and Compare Results 6m
- Single Feature, Kitchen Sink, and Parsimonious Linear Regression 4m
- Lasso Regression 3m
- Ridge Regression 2m
- Elastic Net Regression 6m
- Module Summary 1m
- Module Overview 1m
- Choosing Regression Algorithms 3m
- Support Vector Regression 5m
- Implementing Support Vector Regression 3m
- Nearest Neighbors Regression 4m
- Implementing K-nearest-neighbors Regression 2m
- Stochastic Gradient Descent Regression 3m
- Implementing Stochastic Gradient Descent Regression 2m
- Decision Tree Regression 4m
- Implementing Decision Tree Regression 1m
- Least Angle Regression 4m
- Implementing Least Angle Regression 1m
- Regression with Polynomial Relationships 2m
- Module Summary 1m
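The advanced regressors implemented in the final module can be sketched as follows; this is an illustrative outline on synthetic data, not the course's own code, and the hyperparameter choices are assumptions.

```python
from sklearn.datasets import make_regression
from sklearn.svm import SVR
from sklearn.neighbors import KNeighborsRegressor
from sklearn.linear_model import SGDRegressor
from sklearn.tree import DecisionTreeRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

X, y = make_regression(n_samples=300, n_features=5, noise=5.0, random_state=0)

# SVR and SGD are sensitive to feature scale, so they are wrapped in a
# pipeline that standardizes the inputs first.
regressors = {
    "SVR": make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0)),
    "KNN": KNeighborsRegressor(n_neighbors=5),
    "SGD": make_pipeline(StandardScaler(), SGDRegressor(max_iter=1000)),
    "Tree": DecisionTreeRegressor(max_depth=5, random_state=0),
}
for name, reg in regressors.items():
    reg.fit(X, y)
    print(f"{name}: training R^2 = {reg.score(X, y):.3f}")
```

Because every estimator follows the same `fit`/`score` API, choosing among these algorithms comes down to the shape of your data and the hyperparameters each model exposes, which is the focus of the closing lessons.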