
Understanding Linear Regression – The Ordinary Least Squares

  • Written by junaid
  • June 11, 2022
  • 0 Comments
Linear regression is considered one of the most basic and widely used machine learning techniques. Its statistical machinery is used in predictive analysis to predict numeric, real-valued variables. Occasionally you will find yourself reading about “linear regression and Ordinary Least Squares” in a book or research article. You might know that the phrases “least squares” and “regression” both refer to linear regression, but how do they relate to each other?

Introduction: What is Linear Regression?

Linear regression is a statistical technique used to model the relationship between a dependent variable and one or more independent variables. The fitted linear relationship gives a concise description of how the value of the dependent variable changes when the values of the independent variables change. For a single predictor, the model can be written as:

Y = α + βX + ε

where:
  • Y is the target variable, also called the dependent variable.
  • X is the predictor variable, also called the independent variable.
  • α (alpha) is the intercept, the point where the line crosses the Y-axis.
  • β (beta) is the coefficient of the input variable.
  • ε (epsilon) is the error term.

There are many ways to fit a linear regression, but the two most common methods for computing the coefficients are
  • Ordinary Least Squares (OLS)
  • Gradient Descent
We will discuss Gradient Descent in an upcoming article; for now, let’s have a look at the OLS method.
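To make the model equation concrete, here is a minimal Python sketch that generates synthetic data from Y = α + βX + ε. The values chosen for alpha, beta, and the noise level are illustrative assumptions, not taken from any real dataset:

```python
import numpy as np

rng = np.random.default_rng(42)

alpha, beta = 2.0, 0.5                  # illustrative intercept and coefficient (assumed values)
x = rng.uniform(0, 10, size=100)        # independent variable X
epsilon = rng.normal(0, 1, size=100)    # error term
y = alpha + beta * x + epsilon          # dependent variable Y

print(x[:3], y[:3])
```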

The Ordinary Least Squares Method

The OLS method is a statistical technique used to estimate the parameters of a linear regression model. It is based on the principle of least squares, which states that the best estimate of the parameters is the one that minimizes the sum of squared residuals. In other words, OLS finds the line of best fit by minimizing the distance between the data points and the line itself; the term “least squares” refers to this smallest sum of squared errors. Other estimators, such as Generalized Least Squares and Maximum Likelihood, are alternatives to OLS.
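As a sketch of that idea, reusing the synthetic x and y generated above and the textbook closed-form formulas for the one-predictor case, the line of best fit is the one whose sum of squared residuals is smallest:

```python
# Closed-form OLS for a single predictor:
#   beta_hat  = sum((x - mean(x)) * (y - mean(y))) / sum((x - mean(x))**2)
#   alpha_hat = mean(y) - beta_hat * mean(x)
x_mean, y_mean = x.mean(), y.mean()
beta_hat = np.sum((x - x_mean) * (y - y_mean)) / np.sum((x - x_mean) ** 2)
alpha_hat = y_mean - beta_hat * x_mean

residuals = y - (alpha_hat + beta_hat * x)   # vertical distances to the fitted line
sse = np.sum(residuals ** 2)                 # the quantity OLS minimizes
print(alpha_hat, beta_hat, sse)
```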

Mathematics of OLS

For a dataset with k examples, the OLS method finds the coefficients that best satisfy

y = Xβ + ε

where, with multiple independent variables, the inputs are stacked into a matrix X with one row per example, and the targets into a vector y. To find the values that minimize the error, we can restate the problem as

minimize ‖y − Xβ‖²

Rearranging the first equation (by setting the gradient of the squared error with respect to β to zero) gives a closed-form expression for the target beta-vector:

β̂ = (XᵀX)⁻¹ Xᵀ y

We can see two major products here. First, the product of the transposed input matrix with the input matrix itself (XᵀX). Second, the product of the transposed input matrix with the dependent variable (Xᵀy). Finally, multiplying the inverse of the first product by the second yields the beta-vector.
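A minimal NumPy sketch of that calculation, assuming the x and y arrays from earlier; a column of ones is prepended to the inputs so the intercept is estimated along with the slope:

```python
# Design matrix: a column of ones (for the intercept) next to the input values.
X = np.column_stack([np.ones_like(x), x])

# Normal equations: beta_vec = (X^T X)^{-1} X^T y
XtX = X.T @ X                         # transpose of the input matrix times the input matrix
Xty = X.T @ y                         # transpose of the input matrix times the targets
beta_vec = np.linalg.solve(XtX, Xty)  # solving is more stable than explicit inversion

print(beta_vec)                       # [alpha_hat, beta_hat], matching the values above
# np.linalg.lstsq(X, y, rcond=None)[0] computes the same fit.
```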

Assumptions of the OLS Method

One of the assumptions of OLS is that there is a linear relationship between the dependent variable and the explanatory variables. This means that as the values of the explanatory variables change, so too will the values of the dependent variable. Another assumption of OLS is that there is no multicollinearity among the explanatory variables. Multicollinearity occurs when two or more of the explanatory variables are highly correlated with each other. This can lead to problems with interpretability because it can be difficult to determine which variable is having the biggest impact on the dependent variable.
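A simple way to screen for multicollinearity (one possible diagnostic among several; variance inflation factors are a common alternative) is to inspect the pairwise correlations of the explanatory variables. A hypothetical sketch:

```python
# Hypothetical explanatory variables, with the third column nearly
# duplicating the first to simulate multicollinearity.
X_multi = rng.normal(size=(100, 3))
X_multi[:, 2] = X_multi[:, 0] + 0.01 * rng.normal(size=100)

corr = np.corrcoef(X_multi, rowvar=False)  # columns are treated as variables
print(np.round(corr, 2))                   # off-diagonal entries near +/-1 flag trouble
```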