Saturday, September 24, 2022

LSQ2: An update to LSQ (Casio fx-9750GIII, TI-84 Plus CE)



Least Squares Matrix and Correlation


The program LSQ2 fits data to a function with the least squared error possible. 


Multiple Linear Regression:

f(x1, x2, x3, ...) = b0 + b1 * x1 + b2 * x2 + b3 * x3 + ....


Polynomial Regression:

f(x) = b0 + b1 * x + b2 * x^2 + b3 * x^3 + ...


General:

f(x) = b0 + b1 * g1(x) + b2 * g2(x) + b3 * g3(x) + ...


f(x1, x2, x3, ...) = b0 * g0(x1, x2, x3, ...) + b1 * g1(x1, x2, x3, ...) + ...


The Matrices X, Y, and B



X is your data matrix and is set up as columns:


[  g0(x),  g1(x),  g2(x), ... ]


where each g_j(x) is a function applied to every data point x_i.  


Example 1:   f(x) = b0 + b1 * x 


The columns of the data matrix are set up as:

[ 1,  x ]


The column of ones is included to solve for the constant term b0.


Example 2:  f(x) = b0 + b1 * x + b2 * x^2


The columns of the data matrix are set up as:

[ 1, x, x^2 ]


Example 3:  f(x1, x2) = b0 + b1 * x1 + b2 * x2


The columns of the data matrix are set up as:

[ 1, x1, x2 ]  (note: x1 and x2 are two separate variables here, not x and x squared)



Y is the answer matrix, with n rows and 1 column, where n is the number of data points.  We are fitting the function to the values y_i.


B is the coefficient matrix, consisting of values b0, b1, b2, ....
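
For readers who want to experiment outside the calculator, here is a short Python/numpy sketch (not part of LSQ2 itself) showing one way X and Y might be assembled.  The arrays x and y below are made-up sample data, and the column setup follows the quadratic model of Example 2.

import numpy as np

# Made-up sample data (for illustration only)
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 9.2, 15.8, 26.1])

# Quadratic model f(x) = b0 + b1*x + b2*x^2:
# each column of X is one basis function evaluated at every data point
X = np.column_stack([np.ones_like(x), x, x**2])   # columns [ 1, x, x^2 ]
Y = y.reshape(-1, 1)                              # answer matrix, n rows x 1 column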



Simply put, to find B using the least squares method given the data points:


B = (X^T X)^-1 X^T Y


where X^T is the transpose of X.



How well does the function fit?  


We can predict y values by multiplying X by B.  


P = X B 
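
As a rough Python/numpy sketch of the last two formulas (again outside the calculator, with the same made-up data as before): B comes from the normal equations and P from multiplying X by B.

import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 9.2, 15.8, 26.1])
X = np.column_stack([np.ones_like(x), x, x**2])
Y = y.reshape(-1, 1)

# B = (X^T X)^-1 X^T Y; solving the linear system avoids forming the inverse explicitly
B = np.linalg.solve(X.T @ X, X.T @ Y)

# Predicted values P = X B and the residuals Y - P
P = X @ B
residuals = Y - P
print(B.ravel())
print(P.ravel())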



Determining Coefficient of Correlation:


r^2 = SSreg ÷ SStot = [ B^T X^T Y - (O Y)^2 ÷ n ] ÷ [ Y^T Y - (O Y)^2 ÷ n ]

where O is a ones matrix [[ 1, 1, 1, 1, ... ]] of size 1 x n.  
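
The same formula can be checked in Python/numpy, keeping the 1 x n ones matrix O explicit just as the calculator programs below do (illustrative data again; this sketch is not part of the original listings).

import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 9.2, 15.8, 26.1])
X = np.column_stack([np.ones_like(x), x, x**2])
Y = y.reshape(-1, 1)
n = Y.shape[0]

B = np.linalg.solve(X.T @ X, X.T @ Y)

O = np.ones((1, n))             # ones matrix, 1 x n
S = (O @ Y) ** 2 / n            # (O Y)^2 / n, a 1 x 1 matrix
SSreg = B.T @ X.T @ Y - S       # B^T X^T Y - (O Y)^2 / n
SStot = Y.T @ Y - S             # Y^T Y - (O Y)^2 / n
r_squared = (SSreg / SStot).item()
print(r_squared)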



Casio fx-9750GIII Program:  LSQ2


From the text file:  


'ProgramMode:RUN

ClrText

"2022_-_07_-_19 EWS"

"LEAST SQUARES"

"_Mat _X"?->Mat X

"_Mat _Y"?->Mat Y

Dim Mat Y->List 26

List 26[1]->N

(Trn Mat X*Mat X)^-1*Trn Mat X*Mat Y->Mat B

"_Mat _B:"Disps

Mat BDisps

{1,N}->Dim Mat O

Fill(1,Mat O)

Mat O*Mat Y->Mat S

Mat S*Mat S/N->Mat S

(Trn Mat B*Trn Mat X*Mat Y)-Mat S->Mat R

Mat R*(Trn Mat Y*Mat Y-Mat S)^-1->Mat R

"R_^<2>_:"Disps

Mat R



Listing:


ClrText

"2022-07-19 EWS"

"LEAST SQUARES"

"Mat X"? → Mat X

"Mat Y"? → Mat Y

Dim Mat Y → List 26

List 26[1] → N

(Trn Mat X × Mat X)^-1 × Trn Mat X × Mat Y → Mat B

"Mat B:" ⊿

Mat B ⊿

{1, N} → Dim Mat O

Fill(1, Mat O)

Mat O × Mat Y → Mat S

Mat S × Mat S ÷ N → Mat S

(Trn Mat B × Trn Mat X × Mat Y) - Mat S → Mat R

Mat R × (Trn Mat Y × Mat Y - Mat S)^-1 → Mat R

"R^2:" ⊿

Mat R


Matrices:

Mat X:  data matrix, X

Mat Y:  answer matrix, Y

Mat B: coefficient matrix, B

Mat O: ones matrix

Mat S:  used for calculation

Mat R:  correlation




TI-84 Plus CE Program:  LSQ2  (TI-Basic)



Listing:

"2022-07-19 EWS"
ClrHome
Disp "LEAST SQUARES"
Input "[X]? ", [J]
Input "[Y]? ", [I]
dim([I]) → L6
L6(1) → N
([J]^T [J])^-1 [J]^T [I] → [B]
Disp "[B]: "
Pause [B]
{1,N} → dim([H])
Fill(1,[H])
[H] [I] → [G]
[G] [G] * N^-1 → [G]
[B]^T [J]^T [I] - [G] → [A]
[A] * ([I]^T [I] - [G])^-1 → [A]
Disp "R^2: "
Disp [A]

List:
L6:  [ 2nd ] [ 6 ]

Matrices:
[J]:  data matrix, X
[I]:  answer matrix, Y
[B]: coefficient matrix, B
[H]: ones matrix
[G]:  used for calculation
[A]:  correlation


Examples

Example 1:

Equation: y = b0 + b1 * x1 + b2 * x2

X = [ [ 1, 1, 3 ] [ 1, 2, 4 ] [ 1, 5, 6 ] [ 1, 7, 3 ] [ 1, 7, 2 ] ]
Y = [ [ 0.86 ] [ 0.89 ] [ 0.95 ] [ 0.98 ] [ 0.96 ] ]

Coefficients:  [ [ b0 ] [ b1 ] [ b2 ] ]
B = [ [ 0.8257514451 ] [ 0.01836705202 ] [ 5.953757225E-3 ] ]

Correlation: [ [ 0.9875030926 ] ]
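
To cross-check Example 1 away from the calculator, a Python/numpy sketch along these lines should reproduce roughly the same coefficients and correlation (a verification aid only, not part of LSQ2):

import numpy as np

X = np.array([[1, 1, 3],
              [1, 2, 4],
              [1, 5, 6],
              [1, 7, 3],
              [1, 7, 2]], dtype=float)
Y = np.array([[0.86], [0.89], [0.95], [0.98], [0.96]])
n = Y.shape[0]

B = np.linalg.solve(X.T @ X, X.T @ Y)               # expect roughly [0.8258, 0.0184, 0.0060]

O = np.ones((1, n))
S = (O @ Y) ** 2 / n
r2 = ((B.T @ X.T @ Y - S) / (Y.T @ Y - S)).item()   # expect roughly 0.9875
print(B.ravel(), r2)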

Example 2:

Equation: y = b0 + b1 * x + b2 * x^2

X = [ [ 1, 1, 1^2 ] [ 1, 2, 2^2 ] [ 1, 3, 3^2 ] [ 1, 4, 4^2 ] [ 1, 5, 5^2 ] [ 1, 6, 6^2 ] ]
Y = [ [ 1000 ] [ 1294 ] [ 1511 ] [ 1233 ] [ 1006 ] [ 879 ] ]

Coefficients:
B = [ [ 681.7 ] [ 435.2107143 ] [ -69.30357143 ] ]

Correlation: [ [ 0.8119609681 ] ]
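
Example 2 can be checked the same way; here the x and x^2 columns are generated directly from x = 1 through 6 (again only a sketch for verification):

import numpy as np

x = np.arange(1, 7, dtype=float)                    # x = 1, 2, ..., 6
X = np.column_stack([np.ones_like(x), x, x**2])     # columns [ 1, x, x^2 ]
Y = np.array([[1000], [1294], [1511], [1233], [1006], [879]], dtype=float)
n = Y.shape[0]

B = np.linalg.solve(X.T @ X, X.T @ Y)               # expect roughly [681.7, 435.2, -69.3]

O = np.ones((1, n))
S = (O @ Y) ** 2 / n
r2 = ((B.T @ X.T @ Y - S) / (Y.T @ Y - S)).item()   # expect roughly 0.812
print(B.ravel(), r2)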


Summary

Function to fit:   
f(x1, x2, x3 ... ) = b0 + b1 * g1(x1, x2, x3, ...) + b2 * g2(x1, x2, x3, ...) + ...

X = data matrix
Y = answer matrix, size n x 1
B = coefficient matrix

Determining the Coefficients:   B = (X^T X)^-1 X^T Y

Predicting Values:  P = X B

Determining Coefficient of Correlation:

r^2 = SSreg ÷ SStot = [ B^T X^T Y - (O Y)^2 ÷ n ] ÷ [ Y^T Y - (O Y)^2 ÷ n ]
where O is a ones matrix [[ 1, 1, 1, 1, ... ]] of size 1 x n.  
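
Finally, everything in this summary fits in one small helper function.  This is a Python/numpy sketch rather than calculator BASIC, and the name lsq2 is only illustrative:

import numpy as np

def lsq2(X, Y):
    # Least squares fit using the formulas above.
    # X: n x k data matrix, Y: n x 1 answer matrix.
    X = np.asarray(X, dtype=float)
    Y = np.asarray(Y, dtype=float).reshape(-1, 1)
    n = Y.shape[0]

    B = np.linalg.solve(X.T @ X, X.T @ Y)    # B = (X^T X)^-1 X^T Y
    P = X @ B                                # predicted values P = X B

    O = np.ones((1, n))                      # ones matrix, 1 x n
    S = (O @ Y) ** 2 / n
    r2 = ((B.T @ X.T @ Y - S) / (Y.T @ Y - S)).item()
    return B, P, r2

Calling lsq2 with the X and Y matrices from either example above returns the coefficient matrix, the predicted values, and r^2 in one step.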

Source

Abdi, Hervé.  "Multiple Correlation Coefficient."  Program in Cognition and Neurosciences, The University of Texas at Dallas.  https://personal.utdallas.edu/~herve/Abdi-MCC2007-pretty.pdf  Retrieved July 17, 2022.  


All original content copyright, © 2011-2022.  Edward Shore.   Unauthorized use and/or unauthorized distribution for commercial purposes without express and written permission from the author is strictly prohibited.  This blog entry may be distributed for noncommercial purposes, provided that full credit is given to the author. 
