"LinearRegression" (Machine Learning Method)

Details & Suboptions

  • Linear regression predicts the numerical output y using a linear combination of numerical features x1, x2, …, xn. The conditional probability P(y|x) is modeled according to P(y|x) ∝ Exp[−(y − f(θ, x))²/(2σ²)], with f(θ, x) = θ0 + θ1 x1 + θ2 x2 + … + θn xn.
  • The parameter vector θ is estimated by minimizing the loss function (1/2) Σ_(i=1)^m (y_i − f(θ, x_i))² + λ1 Σ_(i=1)^n |θ_i| + (λ2/2) Σ_(i=1)^n θ_i², where m is the number of examples and n is the number of numerical features.
  • The following suboptions can be given:
  • "L1Regularization"	0	value of λ1 in the loss function
    "L2Regularization"	Automatic	value of λ2 in the loss function
    "OptimizationMethod"	Automatic	which optimization method to use
  • Possible settings for the "OptimizationMethod" option include:
  • "NormalEquation"linear algebra method
    "StochasticGradientDescent"stochastic gradient method
    "OrthantWiseQuasiNewton"orthant-wise quasi-Newton method
  • For this method, PredictorInformation[PredictorFunction[],"Function"] gives a simple expression to compute the predicted value from the features.
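The loss function and the "NormalEquation" optimization method above can be sketched in NumPy. This is an illustrative reconstruction, not Wolfram's implementation: it drops the intercept θ0 for brevity and sets λ1 = 0, since the pure-L2 case has the closed form θ = (XᵀX + λ2 I)⁻¹ Xᵀy; the data below is made up for the example.

```python
import numpy as np

def loss(theta, X, y, lam1=0.0, lam2=0.0):
    """The loss from the Details section:
    (1/2) sum (y_i - f(theta, x_i))^2 + lam1 * sum |theta_i| + (lam2/2) * sum theta_i^2."""
    r = y - X @ theta                     # residuals y_i - f(theta, x_i)
    return 0.5 * r @ r + lam1 * np.abs(theta).sum() + 0.5 * lam2 * (theta @ theta)

def fit_normal_equation(X, y, lam2=0.0):
    """Closed-form minimizer for lam1 = 0 ("NormalEquation"-style linear algebra):
    theta = (X^T X + lam2 I)^-1 X^T y."""
    n = X.shape[1]
    return np.linalg.solve(X.T @ X + lam2 * np.eye(n), X.T @ y)

# Hypothetical data: y ≈ 2 x plus noise
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 1))
y = 2.0 * X[:, 0] + 0.1 * rng.normal(size=50)
theta = fit_normal_equation(X, y, lam2=0.1)
```

When λ1 > 0, the |θ_i| term is not differentiable at zero, which is why methods such as "OrthantWiseQuasiNewton" or "StochasticGradientDescent" are used instead of the closed form.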

Examples


Basic Examples  (2)

Train a predictor on labeled examples:


Look at the PredictorInformation:


Predict a new example:

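Whatever the trained predictor, the prediction itself is just the linear form f(θ, x) from the Details section. A minimal sketch with hypothetical parameter values (not taken from the original session):

```python
# f(theta, x) = theta0 + theta1*x1 + theta2*x2 + ...
theta0, theta = 0.5, [2.0, -1.0]   # hypothetical fitted intercept and weights
x = [1.2, 3.4]                     # a new example's features
y_hat = theta0 + sum(t * xi for t, xi in zip(theta, x))
```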

Generate two-dimensional data:


Train a predictor function on it:


Compare the data with the predicted values and look at the standard deviation:

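The computation in this example, fitting two-dimensional data and measuring the residual standard deviation, can be sketched in NumPy with illustrative data (the original session's data is not shown; plain least squares stands in for the trained predictor):

```python
import numpy as np

rng = np.random.default_rng(1)
# Hypothetical 2D data: y depends linearly on two features plus noise
X = rng.normal(size=(200, 2))
y = 1.5 * X[:, 0] - 0.7 * X[:, 1] + 0.2 * rng.normal(size=200)

# Fit by ordinary least squares
theta, *_ = np.linalg.lstsq(X, y, rcond=None)
pred = X @ theta

# Standard deviation of the residuals, analogous to the
# "StandardDeviation" property of a trained predictor
sd = np.std(y - pred)
```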

Options  (5)

See Also

Predict  PredictorFunction  LinearModelFit  Fit  LeastSquares  GeneralizedLinearModelFit

Related Methods