Linear Regression [Simplest Implementation]

Linear regression using three approaches: the direct method (normal equations), an inbuilt MATLAB function, and stochastic gradient descent (SGD)
1.8K Downloads
Updated 2 Nov 2017


Linear regression attempts to model the relationship between two variables by fitting a linear equation to observed data. One variable is considered to be an explanatory variable, and the other is considered to be a dependent variable. For example, a modeler might want to relate the weights of individuals to their heights using a linear regression model.
Before attempting to fit a linear model to observed data, a modeler should first determine whether or not there is a relationship between the variables of interest. This does not necessarily imply that one variable causes the other (for example, higher SAT scores do not cause higher college grades), but that there is some significant association between the two variables. A scatterplot can be a helpful tool in determining the strength of the relationship between two variables. If there appears to be no association between the proposed explanatory and dependent variables (i.e., the scatterplot does not indicate any increasing or decreasing trends), then fitting a linear regression model to the data probably will not provide a useful model. A valuable numerical measure of association between two variables is the correlation coefficient, which is a value between -1 and 1 indicating the strength of the association of the observed data for the two variables.
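As a quick check before fitting, both the scatterplot and the correlation coefficient can be produced in a few lines of MATLAB. The sketch below uses made-up example data (x, y are illustrative, not part of the submission):

```matlab
% Sketch: inspect the association between x and y before fitting a line.
% x and y are assumed to be column vectors of equal length (example data here).
x = (1:20)';                     % explanatory variable
y = 2 + 0.5*x + randn(20,1);     % dependent variable with some noise

scatter(x, y, 'filled');         % visual check for an increasing/decreasing trend
xlabel('x'); ylabel('y');

R = corrcoef(x, y);              % 2x2 correlation matrix
r = R(1,2);                      % correlation coefficient, between -1 and 1
fprintf('Correlation coefficient: %.3f\n', r);
```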
A linear regression line has an equation of the form Y = a + bX, where X is the explanatory variable and Y is the dependent variable. The slope of the line is b, and a is the intercept (the value of Y when X = 0).
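The three approaches listed at the top can each estimate a and b. The exact code in this submission may differ; the sketch below shows one common way each approach is written, with illustrative data and an assumed learning rate for the SGD loop:

```matlab
% Sketch of the three approaches; variable names and data are illustrative.
% Model: Y = a + b*X, with x and y as column vectors.
x = (1:20)';
y = 2 + 0.5*x + randn(20,1);
X = [ones(size(x)) x];                % design matrix with an intercept column

% 1) Direct method (normal equations): theta = (X'X)^(-1) X'y
theta_direct = (X' * X) \ (X' * y);   % theta_direct = [a; b]

% 2) Inbuilt function: backslash (least squares) or polyfit
theta_inbuilt = X \ y;                % same least-squares solution
p = polyfit(x, y, 1);                 % p = [b a] (slope first)

% 3) Stochastic gradient descent on the squared-error cost
theta_sgd = zeros(2, 1);
alpha = 1e-3;                         % learning rate (assumed; may need tuning)
for epoch = 1:100
    for i = randperm(length(y))       % one example at a time, shuffled
        err = X(i,:) * theta_sgd - y(i);
        theta_sgd = theta_sgd - alpha * err * X(i,:)';
    end
end
```

On this data the direct and inbuilt estimates coincide, while the SGD estimate only approximates them and depends on the learning rate and number of epochs.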
References:
(1) Linear regression notes, IIST: https://www.iist.ac.in/sites/default/files/people/in12167/linear_regression.pdf
(2) Andrew Ng's lecture notes (CS 229)
(3) Linear Regression, Yale Statistics 101: http://www.stat.yale.edu/Courses/1997-98/101/linreg.htm
Check out more Machine Learning submissions:
1. AdaBoost
https://in.mathworks.com/matlabcentral/fileexchange/63156-adaboost

2. SVM using various kernels
https://in.mathworks.com/matlabcentral/fileexchange/63033-svm-using-various-kernels

3. SVM for nonlinear classification
https://in.mathworks.com/matlabcentral/fileexchange/63024-svm-for-nonlinear-classification

4. SMO
https://in.mathworks.com/matlabcentral/fileexchange/63100-smo--sequential-minimal-optimization-

5. Support Vector Regression
https://in.mathworks.com/matlabcentral/fileexchange/63060-support-vector-regression

6. Maze Solver using SARSA
https://in.mathworks.com/matlabcentral/fileexchange/63089-sarsa-reinforcement-learning

7. Gauss-Seidel Method, Jacobi Method
https://in.mathworks.com/matlabcentral/fileexchange/63167-gauss-seidel-method--jacobi-method

Cite As

Bhartendu (2024). Linear Regression [Simplest Implementation] (https://www.mathworks.com/matlabcentral/fileexchange/64930-linear-regression-simplest-implementation), MATLAB Central File Exchange. Retrieved .

MATLAB Release Compatibility
Created with R2015a
Compatible with any release
Platform Compatibility
Windows macOS Linux

Version	Published	Release Notes
1.0.0.0