fx-5800p: Least Squares Fitting
LSQ (least squares) is a program for the Casio fx-5800p that fits a polynomial or a multiple linear regression model to given data and outcomes.
Variables:
Mat A = independent data matrix
Mat B = dependent data matrix, 1-column matrix
Mat C = coefficient matrix. This is the matrix LSQ solves for.
Mat D = predicted values
Calculations:
Coefficients
Mat C = (Mat A^T * Mat A)^-1 * Mat A^T * Mat B = (Mat A)^+ * Mat B
where (Mat A)^+ denotes the Moore-Penrose pseudoinverse of Mat A.
Predicted Values
Mat D = Mat A * Mat C
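For readers who want to check results away from the calculator, here is a minimal sketch of the same two formulas in Python with numpy (my tool of choice here, not part of the calculator program):

import numpy as np

def lsq(A, B):
    # Solve the normal equations (A^T A) C = A^T B for the coefficients,
    # mirroring the calculator program, then compute the predictions D = A C.
    C = np.linalg.solve(A.T @ A, A.T @ B)   # coefficient matrix
    D = A @ C                               # predicted values
    return C, D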
Prog "LSQ"
"LSQ"
"A = IND. DATA"? → Mat A
"B = DEP. DATA"? → Mat B
( Trn( Mat A ) * Mat A )^-1 * Trn( Mat A ) * Mat B → Mat C
"C = COEFS."
Mat C ◢
Mat A * Mat C → Mat D
"D = PRED."
Mat D
Examples
Multiple Linear Regression
Fit the following data to y = a * x1 + b * x2 + c:
(x1, x2, y):
(1, .6, .45)
(2, .3, .49)
(3, .2, .36)
(4, .8, .38)
(5, .6, .39)
Note that the equation has a constant term c. Set up matrix A with three columns: the first corresponding to the term a * x1, the second to the term b * x2, and a third column of ones for the constant term c.
Mat A =
[[ 1, .6, 1]
[2, .3, 1]
[3, .2, 1]
[4, .8, 1]
[5, .6, 1]]
Matrix B will have the dependent values, in this case a 5 x 1 matrix.
Mat B =
[[ .45 ]
[ .49 ]
[ .36 ]
[ .38 ]
[ .39 ]]
With the matrices set up, run LSQ. You can enter the required matrices manually (with [ ALPHA ] [ ln ] for the left bracket and [ ALPHA ] [ x^ ] for the right bracket) or use pre-stored matrices. The fx-5800p has room for only six matrices (Mat A - Mat F).
Results:
Coefficients, Mat C =
[[ -0.02381395349 ]
[ 0.0162790697 ]
[ 0.4773023256 ]]
This corresponds to the equation for the line:
y = -0.02381395349 * x1 + 0.0162790697 * x2 + 0.4773023256
Predicted y values, Mat D =
[[ 0.4632558140 ]
[ 0.4345581395 ]
[ 0.4091162791 ]
[ 0.3950697674 ]
[ 0.368 ]]
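As a quick check of this example off the calculator, the same fit in Python with numpy (an aside, not part of the original program) reproduces the coefficients and predictions above:

import numpy as np
A = np.array([[1, .6, 1], [2, .3, 1], [3, .2, 1], [4, .8, 1], [5, .6, 1]])
B = np.array([.45, .49, .36, .38, .39])
C = np.linalg.solve(A.T @ A, A.T @ B)   # normal equations, as in LSQ
print(C)        # [-0.02381395  0.01627907  0.47730233]
print(A @ C)    # predicted values, matching Mat D above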
Polynomial Regression
Fit a quadratic polynomial of the form y = a*x^2 + b*x + c to the data:
(-3, -2.24)
(-1, 0)
(1, -0.11)
(3, 2.23)
(5, 5.24)
We have only one variable, but the equation contains both x and x^2 terms. Hence, we set up matrix A as follows: the first column holds the x^2 values, the second column the x values, and the third column ones, corresponding to the constant term c.
Mat A =
[[ 9, -3, 1]
[1, -1, 1]
[1, 1, 1]
[9, 3, 1]
[25, 5, 1]]
Again, matrix B has the dependent values:
Mat B =
[[ -2.24 ]
[ 0 ]
[ -0.11 ]
[ 2.23 ]
[ 5.24 ]]
Results:
Mat C =
[[ 0.07125 ]
[ 0.717 ]
[ -0.33425 ]]
which corresponds to: y = 0.07125 * x^2 + 0.717 * x - 0.33425
Mat D =
[[ -1.844 ]
[ -0.98 ]
[ 0.454 ]
[ 2.458 ]
[ 5.032 ]]
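Again, as an off-calculator check (an aside, not part of the original program), numpy's polyfit, which builds the same powers-of-x design matrix internally, reproduces the coefficients:

import numpy as np
x = np.array([-3, -1, 1, 3, 5])
y = np.array([-2.24, 0, -0.11, 2.23, 5.24])
a, b, c = np.polyfit(x, y, 2)   # degree-2 fit, highest power first
print(a, b, c)                  # approximately 0.07125, 0.717, -0.33425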
Even though the least-squares method isn't perfect, it is super powerful. Until next time, have a great day, stay healthy, and stave off any viruses because they have gone crazy this winter!
Eddie
Source: http://en.m.wikipedia.org/wiki/Regression_analysis
This blog is property of Edward Shore. 2015