Certification

Industry-recognized certification enables you to add this credential to your resume upon completion of all courses.

Instructor

Josh Browning

Joshua Browning currently works as a senior data scientist for an e-commerce company. He has 10 years of experience in statistics and data science, working in industries ranging from aerospace engineering to international organizations. He also has several years of experience in education as a mathematics instructor and tutor, and is passionate about helping people understand complex subjects in meaningful and applicable ways.

Penalized regression models and their improvements over traditional regression.

  • Learn how to implement LASSO, Ridge, and Elastic Net models for efficient data analysis.
  • The instructor has 10 years of experience in statistics and data science, working in industries ranging from aerospace engineering to international organizations.

Duration: 3h 57m

Course Description

This course begins with a basic introduction and examines some of the strengths and weaknesses of traditional linear regression. We'll also cover some basics of R, as the examples in this course use the R programming language to analyze data. The second module then dives into LASSO models. We'll see how the LASSO model solves many of the challenges we face with linear regression, and how it can be a very useful tool for fitting linear models. We'll also look at a real-world use case: forecasting sales at 83 different stores. The third and final module looks at two additional regularized regression models: Ridge and ElasticNet. We then compare these models, both theoretically and by examining their performance on the forecasting problem from Module 2.

What am I going to get from this course?

Implement LASSO, Ridge, and Elastic Net models to better analyze data. These models will help you capture relationships in your data, avoid overfitting, and build models that predict better than traditional linear regression.

Prerequisites and Target Audience

What will students need to know or do before starting this course?

This course is taught with the programming language R. Students not familiar with R should be prepared to spend a bit of extra time catching up on some of the basics of R. Additionally, exposure to linear regression (for example, in an introductory statistics course) will be highly useful.

Who should take this course? Who should not?

  • You want to learn how to get started with regularized regression models.
  • You currently use linear regression and want to implement better models.
  • You are curious about the ideas behind machine learning.
  • You should not take this course if you have already successfully implemented and used LASSO, Ridge, and Elastic Net models (unless you didn't fully understand what you were doing).

Curriculum

Module 1: Strengths and Weaknesses of Linear Regression

01:13:27
Lecture 1 Installing R
16:49

We'll look at how to install R and examine some of the basics of R: comments, plotting, packages, help, etc. We'll also install the free RStudio IDE and see how it helps when working with R.
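
As a taste of those basics, here is a minimal sketch in base R (the glmnet package named below is an assumption, chosen because it matches the models used later in the course):

    # Comments in R start with '#'
    x <- c(1, 2, 3, 4, 5)          # create a numeric vector
    plot(x, x^2, type = "b")       # basic plotting
    install.packages("glmnet")     # install a package from CRAN
    library(glmnet)                # load an installed package
    ?plot                          # open the help page for a function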

Lecture 2 Linear Regression Review
17:44

In this lecture, we briefly review the concept of linear regression. How do linear models work? How do they choose one specific line to fit the data? What are the "sums of squares" and how are they used?
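For reference, here is a minimal linear regression in R, using the built-in mtcars dataset as a stand-in example (not necessarily the lecture's dataset):

    fit <- lm(mpg ~ wt + hp, data = mtcars)  # ordinary least-squares fit
    summary(fit)                             # coefficients, R-squared, etc.
    sum(resid(fit)^2)                        # the residual sum of squares that lm() minimizes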

Lecture 3 The Problem with Multicollinearity
08:08

Multicollinearity can cause huge problems when fitting linear regression models. In this lecture, we'll explore what multicollinearity means and examine its impact on an example dataset.
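
As a quick illustration of the problem (on simulated data, not the lecture's dataset), two nearly identical predictors make the individual coefficient estimates unstable even though their combination predicts well:

    set.seed(42)
    n  <- 100
    x1 <- rnorm(n)
    x2 <- x1 + rnorm(n, sd = 0.01)   # x2 is almost a copy of x1
    y  <- 2 * x1 + rnorm(n)
    coef(lm(y ~ x1 + x2))            # the two individual coefficients are wildly unstable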

Lecture 4 Detecting Multicollinearity: VIFs
12:28

In this lecture, we'll explore how to detect multicollinearity using Variance Inflation Factors (VIFs). This statistic can be very useful for measuring the impact of correlated features.
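
One common way to compute VIFs in R is the car package (whether the lecture uses this package is an assumption); a minimal sketch on the stand-in mtcars data:

    library(car)                     # install.packages("car") if needed
    fit <- lm(mpg ~ wt + hp + disp, data = mtcars)
    vif(fit)                         # values above roughly 5-10 are a common warning sign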

Lecture 5 The p>n Problem
10:10

Linear regression models also run into problems when the number of predictor variables (commonly written as "p") is more than the number of observations ("n"). We'll investigate what happens in these scenarios.
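A tiny simulated illustration of the p > n failure mode: with more predictors than observations, ordinary least squares cannot estimate all coefficients:

    set.seed(1)
    n <- 5; p <- 10
    X <- matrix(rnorm(n * p), nrow = n)
    y <- rnorm(n)
    coef(lm(y ~ X))   # some coefficients come back NA: the system is underdetermined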

Lecture 6 Best Linear Unbiased Estimator
08:08

We've examined some shortcomings of linear regression, but in this lecture we'll discuss some of its strengths. In particular, by the Gauss-Markov theorem, linear regression is the best model (in a statistical sense) if you want an unbiased model that is a linear function of the predictors.

Resource 1 Stepwise Variable Selection

Stepwise regression is an extension of linear regression in which we select a subset of variables out of all possible features. This approach offers some improvements over simple linear regression, and we'll explore how well it performs on some example datasets.
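
A minimal sketch of stepwise selection with base R's step() function, again on mtcars as a stand-in dataset:

    full <- lm(mpg ~ ., data = mtcars)       # start from all predictors
    best <- step(full, direction = "both")   # add/drop terms to minimize AIC
    coef(best)                               # the selected subset of variables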

Quiz 1 Module 1 Quiz

Module 2: LASSO

02:05:49
Lecture 7 Regression Penalties
06:05

We'll examine the penalty functions optimized by linear and stepwise regression, and then we'll introduce a new penalty corresponding to the LASSO model. We'll discuss how this penalty works and gain insight into why it improves on linear regression models.
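
For intuition, the penalized objective can be written out directly. This is a sketch only; real implementations such as glmnet use far more efficient algorithms and scale the loss slightly differently (glmnet divides the RSS term by 2n):

    lasso_objective <- function(beta, X, y, lambda) {
      rss <- sum((y - X %*% beta)^2)   # the ordinary least-squares loss
      rss + lambda * sum(abs(beta))    # plus the L1 (LASSO) penalty on the coefficients
    }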

Lecture 8 LASSO Models
09:28

We'll fit our first LASSO model in R! We'll examine how to fit the model, how to predict with it, and how to extract the estimated coefficients.
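
A minimal sketch with the glmnet package (whether the course uses glmnet is an assumption, but it is the standard R implementation; alpha = 1 selects the LASSO penalty), on stand-in data:

    library(glmnet)
    X <- as.matrix(mtcars[, -1]); y <- mtcars$mpg   # stand-in data, not the course dataset
    fit <- glmnet(X, y, alpha = 1)          # fits a whole path of lambda values
    predict(fit, newx = X, s = 0.5)         # predictions at lambda = 0.5
    coef(fit, s = 0.5)                      # coefficients at lambda = 0.5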

Lecture 9 Estimation Along a Path
08:15

We'll learn a bit about the underlying algorithm used to fit LASSO models. This will help us understand how these models work and how we can fit them efficiently.
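
Continuing the glmnet sketch above, the fitted object holds the entire regularization path, and plotting it shows each coefficient shrinking toward zero as lambda grows:

    plot(fit, xvar = "lambda", label = TRUE)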

Lecture 10 Train/Test Error
14:04

We'll examine the differences between error on the training data and on new data (i.e., the "test" set). We'll learn how important it is to evaluate models on their ability to fit new data, and we'll compare LASSO models to linear regression models.
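
A minimal train/test sketch, continuing with the stand-in data from the earlier glmnet example:

    set.seed(7)
    train <- sample(nrow(X), size = floor(0.8 * nrow(X)))
    fit   <- glmnet(X[train, ], y[train], alpha = 1)
    preds <- predict(fit, newx = X[-train, ], s = 0.5)
    mean((y[-train] - preds)^2)   # mean squared error on held-out data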

Lecture 11 Selecting Lambda
11:35

When fitting a LASSO model, many different models are generated, corresponding to different lambda values. We'll talk a bit about how lambda influences the fit, and we'll see how to select a reasonable value of lambda using an example dataset.
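
In glmnet this selection is automated by cv.glmnet(); a sketch on the same stand-in data:

    cvfit <- cv.glmnet(X, y, alpha = 1, nfolds = 10)
    cvfit$lambda.min   # lambda with the lowest cross-validated error
    cvfit$lambda.1se   # largest lambda within one standard error of the minimum
    plot(cvfit)        # the cross-validation error curve across the lambda path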

Lecture 12 Other Parameters
13:26

We'll look at some of the parameters available in the R implementation of LASSO models, and learn when we should use them.
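
Which specific parameters the lecture covers isn't listed here; as a sketch, a few commonly used glmnet arguments (defaults shown) look like this:

    fit <- glmnet(X, y,
                  alpha          = 1,                 # L1/L2 penalty mix
                  standardize    = TRUE,              # scale predictors before fitting
                  intercept      = TRUE,              # include an intercept term
                  penalty.factor = rep(1, ncol(X)),   # per-variable penalty weights
                  nlambda        = 100)               # length of the lambda path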

Lecture 13 Dataset Introduction
13:16

We'll take a look at the example dataset that we'll be using in this course. In this lecture, we'll just explore the dataset and create some variables which will be used in later models.

Lecture 14 Forecasting: Building Features
10:00

The real world application in this course will be to use the dataset described in the previous lecture and generate forecasts at the product level. In this lecture, we'll develop some new features which will be useful for this forecasting.

Lecture 15 Forecasting: Using LASSO
10:43

In this lecture, we'll use the dataset and features we've built so far to create a forecasting model! We'll fit both linear regression and LASSO models.

Lecture 16 Forecasting: Cross-Validation
11:39

In order to evaluate the models fit in the previous lecture, we'll introduce the concept of cross-validation for time series. We'll examine the particular product from the previous lecture and compare the linear and LASSO models.
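
The course's sales data isn't reproduced here, but a rolling-origin scheme on a simulated series shows the structure of time series cross-validation: fit on the past, predict one step ahead, then roll the origin forward:

    library(glmnet)
    set.seed(123)
    ts_y <- as.numeric(arima.sim(model = list(ar = 0.7), n = 200))  # simulated series
    ts_X <- cbind(lag1 = c(NA, head(ts_y, -1)), t = seq_along(ts_y))
    keep <- complete.cases(ts_X)
    ts_X <- ts_X[keep, ]; ts_y <- ts_y[keep]
    errors <- sapply(seq(100, length(ts_y) - 1, by = 10), function(origin) {
      fit  <- glmnet(ts_X[1:origin, ], ts_y[1:origin], alpha = 1)
      pred <- predict(fit, newx = ts_X[origin + 1, , drop = FALSE], s = 0.1)
      (ts_y[origin + 1] - pred)^2   # one-step-ahead squared error
    })
    mean(errors)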

Lecture 17 Forecasting: Comparing Results
17:18

In this lecture, we'll generalize the results of the previous lecture to the time series for all products. We'll compare the performance of our two models for all these products, and seek to understand when certain models perform better than others.

Quiz 2 Module 2 Quiz

Module 3: Ridge and ElasticNet

37:19
Lecture 18 Ridge
06:40
Lecture 19 ElasticNet
05:09

In this lecture, we'll introduce the ElasticNet penalty and examine the differences between this model and the Ridge/LASSO. We'll also discuss the reason for having so many different models.
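
In glmnet terms (assuming that implementation), the alpha argument blends the two penalties: alpha = 1 is the LASSO, alpha = 0 is ridge, and values in between give an ElasticNet:

    enet <- glmnet(X, y, alpha = 0.5)   # an equal mix of L1 and L2 penalties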

Lecture 20 Comparing LASSO and Ridge
11:37

We'll apply our new models to the orange juice sales dataset from the previous module, and we'll explore the differences in the results across the three models.

Lecture 21 Selecting LASSO vs Ridge vs ElasticNet
13:53

Now that we have several different models to choose from, we need a way to determine the best model. In this lecture, we'll explore how to use cross-validation to select a model, and we'll see which models are the best performers on the orange juice dataset.
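
A sketch of that comparison with cv.glmnet(), using the stand-in data from the earlier sketches and the same folds for each model (via the foldid argument) so the comparison is fair:

    set.seed(99)
    foldid <- sample(rep(1:10, length.out = nrow(X)))
    cv_err <- sapply(c(lasso = 1, enet = 0.5, ridge = 0), function(a) {
      cv <- cv.glmnet(X, y, alpha = a, foldid = foldid)
      min(cv$cvm)   # lowest cross-validated error along the lambda path
    })
    cv_err          # choose the model with the smallest error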

Quiz 3 Module 3 Quiz
Quiz 4 Mini-Project

In this project, we'll examine how the size of the dataset impacts linear, ridge, and LASSO regression models. Simulate the following data:

    x  <- runif(n, -10, 10)              # random uniform variable between -10 and 10
    y1 <- x + rnorm(n)                   # linear function of x plus noise
    y2 <- .01*(x-5)*(x+7)*x + rnorm(n)   # cubic function of x plus noise
    y3 <- sin(x/2) + rnorm(n)            # sine function of x plus noise

Vary n to get a good range of dataset sizes; I used 10, 30, 100, 1000, 10000, and 100000. For each simulated dataset, examine how well linear, ridge, and LASSO models fit the data. Given that we have a non-linear relationship between x and y, let's also include polynomial terms in the model (i.e., x^2, x^3, ..., x^10).

Solution: I've uploaded my solution as an R file. Of course, your results may differ depending on how you chose your models, but hopefully they are similar to mine.

Reviews

8 Reviews

Jon B

May, 2017

Thanks to the course and its instructor, I was able to gain hands-on expertise in this field and in implementing LASSO, Ridge, and Elastic Net models. It was also good to learn the basics of R, which will no doubt help me use the R programming language when needed, so I feel equipped with that additional knowledge as well. In particular, the teaching on selecting a model and on the best-performing models gives us practical knowledge. Thanks to Experfy for organizing this course.

Gary S

May, 2017

I always wanted to improve my knowledge of implementing better linear regression models, and also to learn more about regression models and machine learning. In my search, I found exactly this course, which promised to fulfill that desire. And it did.

Enrique E

May, 2017

I am happy overall to have joined this course, which has helped me strengthen my professional competence in linear regression, multicollinearity, lambda selection, datasets, and forecasting.

Franziskos K

July, 2017

You will be amazed by how much you can pick up just by watching and practicing. I struggled with the course initially and wound up spending more time, but it was definitely worth it in the end.

Roxana-Cristina M

July, 2017

The course goes from basic linear regression with one input factor to ridge regression, LASSO, and kernel regression. Module 3 also deals with relevant machine learning subjects like the bias/variance trade-off, overfitting, and validation to motivate ridge and LASSO regression. I rate it as an excellent course for learning.

Lidiia M

July, 2017

The lectures have a consistent progression and go at a speed that makes it easy to adequately understand the theory. I recommend this course to everyone who wants to understand regression from the point of view of machine learning.

Théo A

July, 2017

This is one of the most instructive and helpful online classes I've taken so far. The information provided is generally detailed and relevant. But it is still difficult! The assignments are quite demanding but quite informative.

Luis C

March, 2018

It's a good course for learning LASSO, Ridge, and Elastic Net regression. It shows how to deal with the problems that standard OLS cannot handle properly, such as a high number of variables or few observations compared with the number of variables available. These models can avoid the overfitting that can occur if you use OLS without any regularization.