Thursday, September 12, 2019

HP 32SII and TI-66: Curve Fitting

Introduction

The curve fitting program uses the calculator's built-in linear regression to determine the parameters b ("intercept") and m ("slope") for non-linear curves, using the following transformations:

Logarithmic Regression:  y = b + m * ln x
Transformation:  ( ln x, y, b, m )

Inverse Regression:  y = b + m / x
Transformation:  ( 1/x, y, b, m )

Exponential Regression:  y = b * e^(m * x)
Transformation:  ( x, ln y, e^b, m )

Power Regression:  y = b * x^m
Transformation:  ( ln x, ln y, e^b, m )

Geometric (Exponent) Regression:  y = b * m^x
Transformation:  ( x, ln y, e^b, e^m )

Simple Logistic Regression:  y = 1 / (b + m * e^(-x))
Transformation:  ( e^(-x), 1/y, b, m )
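
To make the transformation scheme concrete, here is a minimal Python sketch (not part of either calculator program; the function and table names are my own) that fits a straight line to the transformed pairs listed above and then applies the listed back-transformations to the intercept and slope:

import math

def linear_fit(X, Y):
    # Ordinary least-squares line Y = b + m*X; returns (b, m, r).
    n = len(X)
    mx, my = sum(X) / n, sum(Y) / n
    sxx = sum((x - mx) ** 2 for x in X)
    syy = sum((y - my) ** 2 for y in Y)
    sxy = sum((x - mx) * (y - my) for x, y in zip(X, Y))
    m = sxy / sxx
    b = my - m * mx
    return b, m, sxy / math.sqrt(sxx * syy)

# (x transform, y transform, intercept back-transform, slope back-transform),
# one row per model listed above
MODELS = {
    "logarithmic": (math.log, lambda y: y, lambda b: b, lambda m: m),
    "inverse": (lambda x: 1 / x, lambda y: y, lambda b: b, lambda m: m),
    "exponential": (lambda x: x, math.log, math.exp, lambda m: m),
    "power": (math.log, math.log, math.exp, lambda m: m),
    "geometric": (lambda x: x, math.log, math.exp, math.exp),
    "logistic": (lambda x: math.exp(-x), lambda y: 1 / y, lambda b: b, lambda m: m),
}

def curve_fit(model, xs, ys):
    fx, fy, gb, gm = MODELS[model]
    b, m, r = linear_fit([fx(x) for x in xs], [fy(y) for y in ys])
    return gb(b), gm(m), r

For example, curve_fit("power", xs, ys) returns b, m, and r for y = b * x^m.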

HP 32SII Program:  Curve Fitting

Note:
1.  This can be adapted to the HP 35S under one label; just take note of where the label branch points are.
2.  The program uses a total of 90 bytes.
3.  Flags 1 and 2 are used.  If flag 1 is set, the slope is reported as e^m; if flag 2 is set, the intercept is reported as e^b.

Program:
// Initialize - LBL X
LBL X
CF 1
CF 2
CLΣ
0
RTN

// Calculation - LBL Y
LBL Y
b   // recall the intercept from the L.R. menu
FS? 2
e^x
STO B
VIEW B
m
FS? 1
e^x
STO M
VIEW M
r   // recall the correlation from the L.R. menu
STO R
VIEW R
RTN

// Logarithmic Regression - LBL L
LBL L
LN 
R/S
GTO L

// Inverse Regression - LBL I
LBL I
1/x
R/S
GTO I

// Exponential Regression - LBL E
LBL E
SF 2
x<>y
LN 
x<>y
R/S
GTO E

// Power Regression - LBL P
LBL P
SF 2
LN 
x<>y
LN 
x<>y
R/S
GTO P

// Geometric/Exponent Regression - LBL G
LBL G
SF 1
SF 2
x<>y
LN
x<>y
R/S 
GTO G

// Simple Logistic Regression - LBL S
LBL S
+/-
e^x
x<>y
1/x 
x<>y
R/S
GTO S

Instructions:
1.  Clear the statistics data and flags by pressing [XEQ] X.
2.  Enter data points, run the proper label, and press [ Σ+ ] or [ Σ- ].

For example, for Logarithmic fit:
y_data [ENTER] x_data [XEQ] L [ Σ+ ]

Subsequent Data:
y_data [ENTER] x_data [R/S] [ Σ+ ]

This scheme allows for undoing data:
y_data [ENTER] x_data [XEQ] L [ Σ- ]

3.  To calculate the intercept (B), slope (M), and correlation (R), press [XEQ] Y.  A rough Python analogue of this workflow is sketched below.
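
The sketch that follows is only an illustration (the class, method, and register names are my own): sigma_plus adds a transformed pair to the summation registers just as [ Σ+ ] does, sign = -1 mimics [ Σ- ], and the results step mirrors LBL Y by applying e^x to the intercept when flag 2 is set and to the slope when flag 1 is set.

import math

class CurveFitSession:
    def __init__(self, x_transform, y_transform, flag1=False, flag2=False):
        self.fx, self.fy = x_transform, y_transform
        self.flag1, self.flag2 = flag1, flag2   # flag 1 -> e^m, flag 2 -> e^b
        self.n = self.sx = self.sy = self.sxx = self.syy = self.sxy = 0.0

    def sigma_plus(self, x, y, sign=+1):
        # sign = -1 mimics [ Σ- ] (removing a data point)
        u, v = self.fx(x), self.fy(y)
        self.n += sign
        self.sx += sign * u
        self.sy += sign * v
        self.sxx += sign * u * u
        self.syy += sign * v * v
        self.sxy += sign * u * v

    def results(self):
        # The LBL Y step: b, m, r from the summation registers,
        # with e^x applied according to flags 1 and 2.
        sxx = self.sxx - self.sx ** 2 / self.n
        syy = self.syy - self.sy ** 2 / self.n
        sxy = self.sxy - self.sx * self.sy / self.n
        m = sxy / sxx
        b = (self.sy - m * self.sx) / self.n
        r = sxy / math.sqrt(sxx * syy)
        if self.flag2:
            b = math.exp(b)
        if self.flag1:
            m = math.exp(m)
        return b, m, r

# Exponential fit y = b * e^(m*x): transform (x, ln y), flag 2 set.
fit = CurveFitSession(lambda x: x, math.log, flag2=True)
for x, y in [(1.0, 2.7), (2.0, 7.4), (3.0, 20.1)]:   # hypothetical sample points
    fit.sigma_plus(x, y)
print(fit.results())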

TI-66 Program:  Curve Fitting

Notes:
1.  This program should also be able to be entered on a TI-58, TI-58C, or TI-59.  At the time of this posting, I have not tried it, so I don't have the key codes.
2.  94 steps are used.  [INV] [SBR] is merged into the RTN step.
3.  Flags 1 and 2 are used.  If flag 1 is set, the slope is reported as e^m; if flag 2 is set, the intercept is reported as e^b.

Program:
// Initialize - key [ A ]
000 LBL
001 A
002 INV
003 ST.F
004 01
005 INV
006 ST.F
007 02
008 CSR
009 0
010 RTN

// Calculation - key [ A' ]
011 LBL
012 A'
013 OP
014 12
015 INV
016 IF.F
017 02
018 (   // left parenthesis
019 INV
020 LN X
021 LBL
022 (  // left parenthesis
023 STO
024 08
025 R/S
026 X<>T
027 INV
028 IF.F
029 01
030 )  // right parenthesis
031 INV
032 LN X
033 LBL
034 )  // right parenthesis
035 STO 
036 07
037 R/S
038 OP
039 13
040 STO
041 09
042 RTN

// Logarithmic Regression - key [ B ]
043 LBL
044 B
045 LN X
046 X<>T
047 R/S
048 RTN

// Inverse Regression - key [ C ]
049 LBL 
050 C
051 1/X
052 X<>T
053 R/S
054 RTN

// Exponential Regression - [ D ]
055 LBL 
056 D
057 ST.F
058 02
059 X<>T
060 R/S
061 LN X
062 R/S
063 RTN

// Power Regression - [ B' ]
064 LBL
065 B'
066 ST.F
067 02
068 LN X
069 X<>T
070 R/S
071 LN X
072 R/S
073 RTN

// Geometric/Exponent Regression - [ C' ]
074 LBL
075 C'
076 ST.F
077 01
078 ST.F
079 02
080 X<>T
081 R/S
082 LN X
083 R/S
084 RTN

// Simple Logistic Regression - [ D' ]
085 LBL
086 D'
087 +/-
088 INV
089 LN X
090 X<>T
091 R/S
092 1/X
093 R/S
094 RTN

Instructions:
1.  Clear the statistics data and flags by pressing [ A ].
2.  Enter data points: enter x, run the proper label, enter y, press [R/S], then press [2nd] ( Σ+ ), or [INV] [2nd] ( Σ+ ) for Σ-.

For example, for Logarithmic fit:
x_data [ B ] y_data [R/S]  [2nd] (Σ+)

This scheme allows for undoing data:
x_data [B] y_data [R/S] [INV] [2nd] (Σ+)

3.  To calculate the intercept (B), slope (M), and correlation (R), press [2nd] [ A' ].

Examples

All results are rounded.

Example 1: Logarithmic Regression
Data (x,y):
(33.8, 102.4)
(34.6, 103.8)
(36.1, 105.1)
(37.8, 106.9)

Results:
B:  -33.4580
M:  38.6498
R:  0.9941

y ≈ -33.4580 + 38.6498 ln x

Example 2:  Inverse Regression
Data (x,y):
(100, 425)
(105, 429)
(110, 444)
(115, 480)

Results:
B:  823.80396
M:  -40664.72143
R:  -0.91195

y ≈ 823.80396 - 40664.72143/x

Example 3: Simple Logistic Regression
Data (x,y):
(1, 11)
(1.3, 9.615)
(1.6, 8.75)
(1.9, 8.158)
(2.6, 7.308)

Results:
B: 0.14675
M: -0.15487
R:  -0.99733

y ≈ 1 / (0.14675 - 0.15487*e^(-x))
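
As a quick cross-check of Example 1, a plain least-squares fit on (ln x, y) in Python should reproduce the rounded results above (the script is my own sketch, independent of the calculator programs):

import math

xs = [33.8, 34.6, 36.1, 37.8]
ys = [102.4, 103.8, 105.1, 106.9]

X = [math.log(x) for x in xs]            # logarithmic model: regress y on ln x
n = len(X)
mx, my = sum(X) / n, sum(ys) / n
sxx = sum((u - mx) ** 2 for u in X)
syy = sum((v - my) ** 2 for v in ys)
sxy = sum((u - mx) * (v - my) for u, v in zip(X, ys))
M = sxy / sxx
B = my - M * mx
R = sxy / math.sqrt(sxx * syy)
print(B, M, R)                            # approximately -33.458, 38.650, 0.9941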


Eddie

All original content copyright, © 2011-2019.  Edward Shore.   Unauthorized use and/or unauthorized distribution for commercial purposes without express and written permission from the author is strictly prohibited.  This blog entry may be distributed for noncommercial purposes, provided that full credit is given to the author.

