
A MEMOIR ON NONLINEAR REGRESSION MODELS

Mathematical and Statistical Characteristics

Allen R. Overman

Agricultural and Biological Engineering

University of Florida

Copyright 2010 Allen R. Overman


ABSTRACT

This memoir is focused on the mathematical and statistical characteristics of nonlinear regression models, and includes a discussion of elements of probability. Particular models are chosen to illustrate various aspects of the procedures. A simple exponential model with two parameters is chosen as the first example. The model is rearranged to a linear form by taking the logarithm of the response variable Y. This is referred to as the 'linearized' form of the model. Linear regression is then performed on ln Y vs. X (the control variable) to obtain an estimate of the exponential parameter b along with the linear correlation coefficient r. Since the correlation coefficient is a measure of system response to the input variable X and reflects scatter in the response data, a decision is then made as to whether the linearized model is adequate or whether nonlinear regression is needed. The 'least squares criterion' is used to determine 'goodness of fit' of the model to the data. A second-order Newton-Raphson procedure is then selected to minimize the error sum of squares of deviations E between measured and estimated values of the response variable and to obtain optimum estimates of the model parameters. In addition, standard errors of the parameter estimates are calculated using the Hessian matrix of the second derivatives of E with respect to the parameters. This requires the inverse of the Hessian matrix along with the variance of the estimate, from which the standard errors of the parameter estimates are obtained. The nonlinear correlation coefficient R is also used as a measure of goodness of fit of the model to the data. Contours of equal probability are then estimated for various levels of uncertainty using the Fisher F statistic. The memoir includes extensive discussion of elements of probability using the binomial expansion, first for the natural numbers n = 1, 2, 3, ... and then for n equal to fractions and negative values. For the natural numbers the expansion leads to finite series, whereas for fractions and negative values it leads to infinite series. All of this was established by Isaac Newton before he invented the calculus; for this work he was appointed the second Lucasian professor of mathematics at Cambridge University, and it led to his first memoir, On the Analysis of Infinite Series. Coupling between discrete and continuous distributions is illustrated using the simple pegboard for linear, triangular, and rectangular configurations. This approach provides a logical foundation for the continuous Gaussian distribution of mathematical statistics. The procedure is further applied to the roll of dice for one, two, three, and four dice. The pegboard is judged to be a simpler procedure to grasp and use in practice.

Keywords: Models, regression, elements of probability.

Acknowledgement: The author expresses appreciation to Amy G. Buhler, Associate University Librarian, University of Florida, for assistance with preparation of this memoir as part of the UF digital library.


Table of Contents

Introduction
Mathematical Characteristics
    Nonlinear Model
    Linearized Form of the Model
    Least Squares Criterion
    Newton-Raphson Procedure for Nonlinear Regression
    Standard Errors of the Estimates
    Equal Probability Contours of Parameters
    Error Sum of Squares Near the Optimum
    Maximum Likelihood and Least Squares Analysis
Statistical Analysis of the Model
    Linearized Form of the Model
    Nonlinear Regression
    Standard Errors of the Estimates
    Equal Probability Contours
    Dependence of E on b Near Minimum E
Summary
References

Tables
1. Dependence of a response variable (Y) on a control variable (X).
2. Newton-Raphson iterations for nonlinear regression of the exponential model.
3. Newton-Raphson iterations of the exponential model for initial b = -0.5000.
4. Combinations of A and b to satisfy the equal probability equation near minimum E.
5. Combinations of A and b to satisfy the equal probability equation for 75% probability.
6. Combinations of A and b to satisfy the equal probability equation for 95% probability.
7. Combinations of A and b to satisfy the equal probability equation for 99% probability.
8. Correlation of E with b using a parabolic model.

Figures
1. Dependence of response variable (Y) on control variable (X). Data from Table 1. Curve drawn from Eq. (35).
2. Dependence of ln Y on X. Data from Table 1. Line drawn from Eq. (33).
3. Scatter plot of estimated response variable (Ŷ) vs. measured response variable (Y). Line represents the 45° diagonal.
4. Equal probability contours between parameters A and b. Contours drawn from Table 5 (75%), Table 6 (95%), and Table 7 (99%) probability levels, respectively. Optimum and standard error values of A = 5.016 ± 0.075 and b = -0.5161 ± 0.0129 are also shown.
5. Dependence of error sum of squares (E) on exponential parameter (b) for linear parameter A = 5.016. Parabola drawn from Eq. (72).


Introduction

Scientific analysis generally involves two essential components: (1) a set of data (measurements or observations) and (2) a conceptual model. The process of drawing inference about the system involves uncertainty in both the data and the model. In the case of an algebraic mathematical model, regression analysis is used to evaluate the parameters in the model. In regression analysis it is common to minimize the sum of squares of deviations between measured and estimated values of the response variable as the criterion of 'goodness of fit' of the model to the data. If all the parameters in the model occur in linear form (such as linear, quadratic, cubic, etc.), then the procedure is called linear regression. If one or more of the parameters in the model occur in nonlinear form (such as exponential), then the procedure is called nonlinear regression. Linear regression is the simpler of the two since it involves linear algebra, whereas nonlinear regression involves an iterative procedure to estimate the parameters. Both methods are illustrated in this memoir.

A variety of statistical measures are used to describe the quality of a model with a particular set of data. The first step is optimization of the model to obtain best estimates of the parameters. The next step is to calculate standard errors of the parameter estimates to determine uncertainty in the parameters. Relative error of an estimate is then calculated as standard error divided by the estimate. A scatter plot of estimated vs. measured response variable is often included to illustrate scatter of values and any evidence of bias in the model. It is also possible to draw contours of equal probability (uncertainty) between two parameters by use of Fisher's F statistic. Many of these points have been addressed in a previous publication (Overman et al., 1990) describing crop response to applied nitrogen with a logistic model. In this document a simple exponential model with one linear and one nonlinear parameter is applied to a set of data. Mathematical and statistical characteristics are discussed in detail to illustrate the various steps involved.

Mathematical Characteristics

Nonlinear Model

Consider the nonlinear regression model

    \hat{Y} = A \exp(bX)    (1)

where X is the control variable, Y is the response variable, A is the linear model parameter, and b is the exponential model parameter. For this discussion we consider X to be positive (X >= 0) and Y to be positive (Y > 0). It follows that parameter A must be positive as well. Parameter b can be positive or negative. For positive b, Y increases with X (Y >= A, dY/dX > 0), whereas for negative b, Y decreases with X (Y <= A, dY/dX < 0). Equation (1) is considered nonlinear in the regression sense because of the exponential parameter b.

Linearized Form of the Model


Now Eq. (1) can be converted to a linear form by taking the logarithm of Y

    Z = \ln Y = \ln A + bX = a + bX    (2)

where a = ln A. Parameters a and b can then be estimated by linear regression of Z vs. X. In Eq. (2) ln represents the natural logarithm. In some cases these estimates of the parameters may be deemed sufficient for the purpose at hand. In other cases a more rigorous procedure may be desired, such as nonlinear regression.

Least Squares Criterion

For regression analysis we begin with a criterion for 'goodness of fit' of the model to the data. Define the error sum of squares of deviations (E) between data and model by

    E = \sum_{i=1}^{n} (Y_i - \hat{Y}_i)^2    (3)

where n is the number of observations, Y_i is the observed value, and \hat{Y}_i is the estimated value from the model. The goal is to choose parameters A and b to minimize E. This is called the least squares criterion for goodness of fit of the model to the data. For the exponential model this takes the form

    E = \sum_{i=1}^{n} [Y_i - A \exp(bX_i)]^2    (4)
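The least squares criterion of Eq. (4) is easy to evaluate numerically. The following is a minimal Python sketch (added here as an illustration, assuming NumPy is available; the function names are not part of the memoir) that computes E for the exponential model using the data of Table 1:

    import numpy as np

    def model(X, A, b):
        """Exponential model of Eq. (1): Y-hat = A * exp(b * X)."""
        return A * np.exp(b * X)

    def error_sum_of_squares(X, Y, A, b):
        """Least squares criterion of Eq. (4): E = sum of (Y_i - A exp(b X_i))^2."""
        residuals = Y - model(X, A, b)
        return np.sum(residuals ** 2)

    # Data of Table 1
    X = np.array([0.0, 0.5, 1.0, 1.5, 2.0, 2.5, 3.0, 3.5, 4.0, 4.5, 5.0])
    Y = np.array([5.0, 4.0, 2.8, 2.2, 2.0, 1.5, 1.0, 0.9, 0.6, 0.4, 0.3])
    print(error_sum_of_squares(X, Y, 5.016, -0.5161))   # approx. 0.1521

Evaluated at the optimum parameters obtained later in the memoir (A = 5.016, b = -0.5161), this reproduces the minimum error sum of squares of about 0.1521.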

For regression purposes think of E as a function of A and b, say E = E(A, b). At the minimum value of E it can be shown from calculus that

    dE = \frac{\partial E}{\partial A}\, dA + \frac{\partial E}{\partial b}\, db = 0    (5)

To minimize E w.r.t. (with respect to) A and b requires that

    \frac{\partial E}{\partial A} = \frac{\partial E}{\partial b} = 0    (6)

simultaneously. This is called the necessary condition for a minimum. To insure a minimum, the sufficient condition from calculus is

    \frac{\partial^2 E}{\partial A^2} > 0 \quad \text{and} \quad \frac{\partial^2 E}{\partial b^2} > 0    (7)


The partial derivatives can be obtained from Eq. (4) and are given by the equations

    \frac{\partial E}{\partial A} = -2 \sum Y \exp(bX) + 2A \sum \exp(2bX)    (8)

    \frac{\partial^2 E}{\partial A^2} = 2 \sum \exp(2bX)    (9)

    \frac{\partial^2 E}{\partial A \partial b} = \frac{\partial^2 E}{\partial b \partial A} = -2 \sum XY \exp(bX) + 4A \sum X \exp(2bX)    (10)

    \frac{\partial E}{\partial b} = -2A \left[ \sum XY \exp(bX) - A \sum X \exp(2bX) \right]    (11)

    \frac{\partial^2 E}{\partial b^2} = -2A \sum X^2 Y \exp(bX) + 4A^2 \sum X^2 \exp(2bX)    (12)

The subscripts have been omitted for convenience and the cross derivatives have been included for later use in the analysis. The derivative in Eq. (8) can be set to zero, which leads to

    A = \frac{\sum Y \exp(bX)}{\sum \exp(2bX)}    (13)

Equation (13) gives the optimum estimate of the linear parameter A for an assumed value of b. Setting Eq. (11) to zero leads to an implicit equation in parameter b. An iterative procedure is needed to find the b which will cause Eq. (11) to vanish. The second-order Newton-Raphson procedure is chosen for this purpose (Adby and Dempster, 1974).

Newton-Raphson Procedure for Nonlinear Regression

An initial estimate of parameter b is chosen in the neighborhood of minimum E. Since we can treat E as a continuous function of b, the derivative at a new value, say b + Δb, can be related to the derivative at b by the Taylor series expansion

    \left( \frac{\partial E}{\partial b} \right)_{b+\Delta b} = \left( \frac{\partial E}{\partial b} \right)_b + \left( \frac{\partial^2 E}{\partial b^2} \right)_b \Delta b + \frac{1}{2!} \left( \frac{\partial^3 E}{\partial b^3} \right)_b (\Delta b)^2 + \frac{1}{3!} \left( \frac{\partial^4 E}{\partial b^4} \right)_b (\Delta b)^3 + \cdots    (14)

It is implicitly assumed that the series represented by Eq. (14) converges to a finite value. The strategy is to set this new derivative to zero and to truncate the series with the linear term in Δb, which leads to


    \Delta b = - \frac{(\partial E / \partial b)_b}{(\partial^2 E / \partial b^2)_b}    (15)

A new estimate of parameter b is obtained from

    b' = b + \Delta b    (16)

New estimates are then obtained for A, ∂E/∂b, ∂²E/∂b², and Δb. The procedure is repeated until the criterion

    \left| \frac{\Delta b}{b} \right| < \varepsilon    (17)

is met, where ε is typically chosen as 10^-3 to 10^-5. The final values obtained by this procedure are chosen as the optimum Â and b̂ and the minimum E, assuming that the procedure converges. It is necessary to choose the initial value of b near the optimum value to insure convergence of the procedure. Convergence requires that the second derivative in Eq. (15) be positive.
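The iteration scheme of Eqs. (13) and (15)-(17) can be sketched in a few lines of Python. The block below is an illustration under the assumption that NumPy is available; the function name fit_exponential is ad hoc and not from the memoir, and the sketch is not the author's original computation:

    import numpy as np

    def fit_exponential(X, Y, b0, eps=1e-5, max_iter=50):
        """Newton-Raphson iteration on b (Eqs. 15-17); A follows from Eq. (13)."""
        b = b0
        for _ in range(max_iter):
            e = np.exp(b * X)
            A = np.sum(Y * e) / np.sum(e ** 2)                               # Eq. (13)
            dE = -2.0 * A * (np.sum(X * Y * e) - A * np.sum(X * e ** 2))     # Eq. (11)
            d2E = (-2.0 * A * np.sum(X**2 * Y * e)
                   + 4.0 * A**2 * np.sum(X**2 * e**2))                       # Eq. (12)
            delta = -dE / d2E                                                # Eq. (15)
            b = b + delta                                                    # Eq. (16)
            if abs(delta / b) < eps:                                         # Eq. (17)
                break
        e = np.exp(b * X)
        A = np.sum(Y * e) / np.sum(e ** 2)   # recompute A at the converged b
        return A, b

    X = np.array([0.0, 0.5, 1.0, 1.5, 2.0, 2.5, 3.0, 3.5, 4.0, 4.5, 5.0])
    Y = np.array([5.0, 4.0, 2.8, 2.2, 2.0, 1.5, 1.0, 0.9, 0.6, 0.4, 0.3])
    A_hat, b_hat = fit_exponential(X, Y, b0=-0.5524)
    print(A_hat, b_hat)   # approx. 5.016 and -0.5161

Starting from the linearized estimate b = -0.5524, this sketch converges to the same optimum reported later in the memoir.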

Standard Errors of the Estimates

The next step is to calculate the standard errors of the estimates of the parameters. This procedure requires calculation of the Hessian matrix [H] of the second-order derivatives, given by

    [H] = \begin{bmatrix} H_{AA} & H_{Ab} \\ H_{bA} & H_{bb} \end{bmatrix} = \begin{bmatrix} \partial^2 E/\partial A^2 & \partial^2 E/\partial A \partial b \\ \partial^2 E/\partial b \partial A & \partial^2 E/\partial b^2 \end{bmatrix}    (18)

where the derivatives are evaluated at (Â, b̂). Since the cross derivatives are equal, it follows that the Hessian matrix is symmetric. The inverse of the Hessian matrix yields the elements

    [H]^{-1} = \begin{bmatrix} H^{-1}_{AA} & H^{-1}_{Ab} \\ H^{-1}_{bA} & H^{-1}_{bb} \end{bmatrix}    (19)

where the inverse Hessian is also symmetric. The variance of the estimate is defined by

    S^2_{Y \cdot X} = \frac{\sum_i (Y_i - \hat{Y}_i)^2}{n - p}    (20)

where p is the number of parameters in the model. The standard errors of the estimates are then given by


    s_A = \left[ S^2_{Y \cdot X}\, H^{-1}_{AA} \right]^{1/2}    (21)

    s_b = \left[ S^2_{Y \cdot X}\, H^{-1}_{bb} \right]^{1/2}    (22)

The covariance of the estimate is given by

    \mathrm{COV}(A, b) = S^2_{Y \cdot X}\, H^{-1}_{Ab}    (23)

Standard errors of the estimates provide a measure of uncertainty in the parameter estimates for a given model and a particular set of data.

Equal Probability Contours of Parameters

The next mathematical characteristic which we explore is equal probability contours of A vs. b around the optimum for a chosen level of uncertainty. Note that the minimum error is calculated from

    E(\hat{A}, \hat{b}) = \sum_{i=1}^{n} [Y_i - \hat{A} \exp(\hat{b} X_i)]^2    (24)

Now the error at some level of probability, q, is related to E(\hat{A}, \hat{b}) by (Draper and Smith, 1981, p. 472)

    E(A, b) = E(\hat{A}, \hat{b}) \left[ 1 + \frac{p}{n - p}\, F(p, n - p, q) \right]    (25)

where p is the number of parameters in the model, q is the probability level, and F is taken from tables for Fisher's analysis of variance (F statistic). The goal is to obtain combinations of parameters A and b which satisfy

    \sum_{i=1}^{n} [Y_i - A \exp(bX_i)]^2 = E(A, b) = \text{constant}    (26)

This leads to a plot of A vs. b which satisfies Eq. (26), and hence to an equal probability contour.

Error Sum of Squares Near the Optimum

The final characteristic which we explore is the dependence of E on b at the fixed value A = Â. For the case of a linear model this relationship follows a parabola. Does it hold for the nonlinear exponential model as well? If so, then we should obtain the parabola

    E = \alpha + \beta b + \gamma b^2    (27)


where α, β, γ are estimated from values of E vs. b near the optimum (minimum).

Maximum Likelihood and Least Squares Analysis

This section focuses on the connection between the maximum likelihood method of Fisher and the least squares criterion (Frieden, 1983, chapter 14). The challenge is to calibrate a mathematical model to relate the dependent variable (y) to the independent variable (x). Assume that the error in the measurements of y_i follows a Gaussian probability density function

    p(y_i) = \frac{1}{\sigma \sqrt{2\pi}} \exp\left[ -\frac{(y_i - \mu)^2}{2\sigma^2} \right]    (28)

with mean μ and variance σ². If we further assume that the error in the predicted values (ŷ_i) from the model also follows this same error law with the same mean and variance, then the probability density function for the error between measured and estimated y is given by

    p(y_i, \hat{y}_i) = \frac{1}{\sigma \sqrt{2\pi}} \exp\left[ -\frac{(y_i - \hat{y}_i)^2}{2\sigma^2} \right]    (29)

For n observations we can assume the joint probability (p) given by the product of the individual terms

    p(y_i, \hat{y}_i) = \left( \frac{1}{\sigma \sqrt{2\pi}} \right)^n \exp\left[ -\frac{1}{2\sigma^2} \sum_{i=1}^{n} (y_i - \hat{y}_i)^2 \right]    (30)

This is referred to as the maximum likelihood principle when the parameters of the model have been chosen to maximize the function given by Eq. (30). Such a choice will also maximize the logarithm of p

    \ln p = n \ln\left( \frac{1}{\sigma \sqrt{2\pi}} \right) - \frac{1}{2\sigma^2} \sum_{i=1}^{n} (y_i - \hat{y}_i)^2    (31)

In order for ln p to be a maximum, it follows that the error (E) defined by

    E = \sum_{i=1}^{n} (y_i - \hat{y}_i)^2    (32)

must be a minimum. Equation (32) therefore defines the least squares error between measured and predicted values of y based on the assumptions stated.


Statistical Analysis of the Model

Linearized Form of the Model

In this section the procedure is applied to the particular set of data listed in Table 1. The first step is to plot the data to see the trend and scatter (see Figure 1). The decrease in Y with increase in X appears to follow an exponential pattern with negative b. The next step is to plot ln Y vs. X to test this hypothesis (see Figure 2). Since Figure 2 appears to follow a straight line, linear regression of ln Y vs. X leads to the regression equation

    Z = \ln \hat{Y} = a + bX = 1.670 - 0.5524\,X, \qquad r = -0.9945    (33)

with a correlation coefficient of r = -0.9945. This leads to the prediction equation

    \hat{Y} = 5.31 \exp(-0.5524\,X)    (34)

It should be noted that Eq. (34) does not minimize E for Eq. (1), but instead minimizes the error sum of squares for Z. For some purposes Eq. (34) may be deemed adequate for analysis. A more rigorous procedure follows from nonlinear regression. The value of parameter b = -0.5524 is then used as the first estimate in the iteration procedure.
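Readers who wish to reproduce the linearized fit of Eq. (33) can do so with a short Python sketch (an illustration assuming NumPy; not part of the original analysis):

    import numpy as np

    X = np.array([0.0, 0.5, 1.0, 1.5, 2.0, 2.5, 3.0, 3.5, 4.0, 4.5, 5.0])
    Y = np.array([5.0, 4.0, 2.8, 2.2, 2.0, 1.5, 1.0, 0.9, 0.6, 0.4, 0.3])

    Z = np.log(Y)                      # Eq. (2): Z = ln Y = a + b X
    b, a = np.polyfit(X, Z, 1)         # slope b and intercept a of the linearized model
    r = np.corrcoef(X, Z)[0, 1]        # linear correlation coefficient
    print(a, b, r)                     # approx. 1.670, -0.5524, -0.9945
    print(np.exp(a))                   # A = exp(a), approx. 5.31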

Nonlinear Regression

We now outline the nonlinear regression procedure in detail, as given in Table 2. In each step the sums Σ Y exp(bX), Σ exp(2bX), Σ XY exp(bX), Σ X exp(2bX), Σ X²Y exp(bX), and Σ X² exp(2bX) are evaluated at the current value of b and substituted into Eqs. (13), (11), (12), and (15).

Iteration 1: b = -0.5524
    A = 12.0858/2.3508 = 5.1411
    ∂E/∂b = -2(5.1411)[8.3737 - (5.1411)(1.5643)] = -3.4083
    ∂²E/∂b² = -2(5.1411)(17.0640) + 4(5.1411)²(2.7548) = 115.7920
    Δb = 3.4083/115.7920 = 0.0294;  b = -0.5524 + 0.0294 = -0.5230

Iteration 2: b = -0.5230
    A = 12.3381/2.4476 = 5.0409
    ∂E/∂b = -2(5.0409)[8.8316 - (5.0409)(1.7382)] = -0.7008
    ∂²E/∂b² = -2(5.0409)(16.1792) + 4(5.0409)²(3.1844) = 160.5555
    Δb = 0.7008/160.5555 = 0.0044;  b = -0.5230 + 0.0044 = -0.5186

Iteration 3: b = -0.5186
    A = 12.3773/2.4631 = 5.0251
    ∂E/∂b = -2(5.0251)[8.9034 - (5.0251)(1.7666)] = -0.2619
    ∂²E/∂b² = -2(5.0251)(16.3575) + 4(5.0251)²(3.2559) = 164.4710
    Δb = 0.2619/164.4710 = 0.0016;  b = -0.5186 + 0.0016 = -0.5170

Iteration 4: b = -0.5170
    A = 12.3915/2.4687 = 5.0194
    ∂E/∂b = -2(5.0194)[8.9296 - (5.0194)(1.7770)] = -0.1017
    ∂²E/∂b² = -2(5.0194)(16.4228) + 4(5.0194)²(3.2823) = 165.9168
    Δb = 0.1017/165.9168 = 0.0006;  b = -0.5170 + 0.0006 = -0.5164

Iteration 5: b = -0.5164
    A = 12.3968/2.4708 = 5.0173
    ∂E/∂b = -2(5.0173)[8.9396 - (5.0173)(1.7810)] = -0.0380
    ∂²E/∂b² = -2(5.0173)(16.4478) + 4(5.0173)²(3.2925) = 166.4853
    Δb = 0.0380/166.4853 = 0.0002;  b = -0.5164 + 0.0002 = -0.5162

Iteration 6: b = -0.5162
    A = 12.3984/2.4715 = 5.0165
    ∂E/∂b = -2(5.0165)[8.9424 - (5.0165)(1.7822)] = -0.0200
    ∂²E/∂b² = -2(5.0165)(16.4550) + 4(5.0165)²(3.2954) = 166.6255
    Δb = 0.0200/166.6255 = 0.00012;  b = -0.5162 + 0.0001 = -0.5161

Iteration 7: b = -0.5161
    A = 12.39956/2.47194 = 5.0161
    ∂E/∂b = -2(5.0161)[8.94460 - (5.0161)(1.78305)] = -0.00645
    ∂²E/∂b² = -2(5.0161)(16.46050) + 4(5.0161)²(3.29766) = 166.7581
    Δb = 0.00645/166.7581 = 0.000039;  b = -0.5161 + 0.0000 = -0.5161

    |Δb/b| = 0.000039/0.5161 = 0.000076 < 10^-4


The iterations are terminated at this point and lead to the estimation equation

    \hat{Y} = 5.016 \exp(-0.5161\,X)    (35)

Note that the procedure converged to the final values in seven steps. The regression curve in Figure 1 is drawn from Eq. (35). A scatter plot of Ŷ vs. Y is shown in Figure 3. The question now arises as to convergence if the first estimate is greater than the true value, say b = -0.5000. The steps are outlined below for this case and are summarized in Table 3.

Iteration 1: b = -0.5000
    A = 12.5457/2.5311 = 4.9566
    ∂E/∂b = -2(4.9566)[9.3435 - (4.9566)(1.8937)] = +0.4253
    ∂²E/∂b² = -2(4.9566)(17.1369) + 4(4.9566)²(3.5804) = 181.9699
    Δb = -0.4253/181.9699 = -0.0023;  b = -0.5000 - 0.0023 = -0.5023

Iteration 2: b = -0.5023
    A = 12.5244/2.5224 = 4.9653
    ∂E/∂b = -2(4.9653)[9.1757 - (4.9653)(1.8733)] = +1.4465
    ∂²E/∂b² = -2(4.9653)(17.0537) + 4(4.9653)²(3.5381) = 179.5627
    Δb = -1.4465/179.5627 = -0.0081;  b = -0.5023 - 0.0081 = -0.5104

Iteration 3: b = -0.5104
    A = 12.4513/2.4927 = 4.9951
    ∂E/∂b = -2(4.9951)[9.0398 - (4.9951)(1.8215)] = +0.5872
    ∂²E/∂b² = -2(4.9951)(16.6975) + 4(4.9951)²(3.3950) = 172.0235
    Δb = -0.5872/172.0235 = -0.0034;  b = -0.5104 - 0.0034 = -0.5138

Iteration 4: b = -0.5138
    A = 12.4201/2.4802 = 5.0077
    ∂E/∂b = -2(5.0077)[8.9829 - (5.0077)(1.7985)] = +0.2348
    ∂²E/∂b² = -2(5.0077)(16.5565) + 4(5.0077)²(3.3369) = 168.8986
    Δb = -0.2348/168.8986 = -0.0014;  b = -0.5138 - 0.0014 = -0.5152

Iteration 5: b = -0.5152
    A = 12.4077/2.4751 = 5.0130
    ∂E/∂b = -2(5.0130)[8.9597 - (5.0130)(1.7891)] = +0.0908
    ∂²E/∂b² = -2(5.0130)(16.4984) + 4(5.0130)²(3.3131) = 167.6221
    Δb = -0.0908/167.6221 = -0.0005;  b = -0.5152 - 0.0005 = -0.5157

Iteration 6: b = -0.5157
    A = 12.4033/2.4734 = 5.0147
    ∂E/∂b = -2(5.0147)[8.9515 - (5.0147)(1.7858)] = +0.0376
    ∂²E/∂b² = -2(5.0147)(16.4779) + 4(5.0147)²(3.3047) = 167.1526
    Δb = -0.0376/167.1526 = -0.0002;  b = -0.5157 - 0.0002 = -0.5159

Iteration 7: b = -0.5159
    A = 12.4012/2.4726 = 5.0154
    ∂E/∂b = -2(5.0154)[8.9478 - (5.0154)(1.7843)] = +0.0118
    ∂²E/∂b² = -2(5.0154)(16.4684) + 4(5.0154)²(3.3008) = 166.9252
    Δb = -0.0118/166.9252 = -0.00007;  b = -0.5159 - 0.00007 = -0.5160

Iteration 8: b = -0.5160
    A = 12.4007/2.4724 = 5.0157
    ∂E/∂b = -2(5.0157)[8.9468 - (5.0157)(1.7839)] = +0.00744
    ∂²E/∂b² = -2(5.0157)(16.4659) + 4(5.0157)²(3.2999) = 166.8813
    Δb = -0.00744/166.8813 = -0.000045;  b = -0.5160 - 0.000045 = -0.5161

    |Δb/b| = 0.000045/0.5160 = 0.000087 < 10^-4

The procedure again converges to Eq. (35).


Another measure of quality of fit of the model to the data is given by the nonlinear correlation coefficient defined by (Cornell and Berger, 1987)

    R = \left[ 1 - \frac{\sum (Y_i - \hat{Y}_i)^2}{\sum (Y_i - \bar{Y})^2} \right]^{1/2} = \left[ 1 - \frac{0.152098}{23.396364} \right]^{1/2} = 0.99674    (36)

which shows excellent agreement between model and data.
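The nonlinear correlation coefficient of Eq. (36) can be checked directly from Eq. (35) and the data of Table 1. A brief Python sketch (NumPy assumed; an illustration only) is:

    import numpy as np

    X = np.array([0.0, 0.5, 1.0, 1.5, 2.0, 2.5, 3.0, 3.5, 4.0, 4.5, 5.0])
    Y = np.array([5.0, 4.0, 2.8, 2.2, 2.0, 1.5, 1.0, 0.9, 0.6, 0.4, 0.3])
    Y_hat = 5.016 * np.exp(-0.5161 * X)          # Eq. (35)

    ss_res = np.sum((Y - Y_hat) ** 2)            # approx. 0.1521
    ss_tot = np.sum((Y - Y.mean()) ** 2)         # approx. 23.3964
    R = np.sqrt(1.0 - ss_res / ss_tot)           # Eq. (36)
    print(R)                                     # approx. 0.9967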

000012.0

47194.20161.539956.1222expexp2

bXAbXY

A

E (37)

094388.447194.222exp22

2

bX

A

E (38)

08866.1778305.10161.529446.82

2exp2exp222

bXXAbXXY

Ab

E

bA

E (39)

000645.078305.10161.594460.80161.52

2expexp2

bXXAbXXYA

b

E (40)

07581.16629766.30161.5246050.160161.52

2exp2exp2 222

2

bXXAbXYXA

b

E (41)

Note that the first derivatives are approximately zero and the second derivatives are positive as required. Standard Errors of the Estimates The second derivatives allow calculation of the Hessian matrix


    [H] = \begin{bmatrix} H_{AA} & H_{Ab} \\ H_{bA} & H_{bb} \end{bmatrix} = \begin{bmatrix} \partial^2 E/\partial A^2 & \partial^2 E/\partial A \partial b \\ \partial^2 E/\partial b \partial A & \partial^2 E/\partial b^2 \end{bmatrix} = \begin{bmatrix} 4.94388 & 17.8866 \\ 17.8866 & 166.7581 \end{bmatrix}    (42)

It follows that the inverse Hessian matrix becomes

    [H]^{-1} = \begin{bmatrix} H^{-1}_{AA} & H^{-1}_{Ab} \\ H^{-1}_{bA} & H^{-1}_{bb} \end{bmatrix} = \begin{bmatrix} 0.3305402955 & -0.0354540022 \\ -0.0354540022 & 0.0097995332 \end{bmatrix}    (43)

which is symmetric as required. The variance of the estimate is calculated from

    S^2_{Y \cdot X} = \frac{\sum (Y_i - \hat{Y}_i)^2}{n - p} = \frac{0.152098}{11 - 2} = 0.016900    (44)

It follows that the standard errors of the estimates and the covariance become

    s_A = \left[ S^2_{Y \cdot X}\, H^{-1}_{AA} \right]^{1/2} = \left[ (0.016900)(0.3305402955) \right]^{1/2} = 0.0747    (45)

    s_b = \left[ S^2_{Y \cdot X}\, H^{-1}_{bb} \right]^{1/2} = \left[ (0.016900)(0.0097995332) \right]^{1/2} = 0.0129    (46)

    \mathrm{COV}(A, b) = S^2_{Y \cdot X}\, H^{-1}_{Ab} = (0.016900)(-0.0354540022) = -0.000599    (47)

Under ideal circumstances the covariance would be zero, to signify that the model parameters were uncorrelated. Final estimates of the parameters are

    A = 5.016 \pm 0.075    (48)

    b = -0.5161 \pm 0.0129    (49)

with relative errors of

    \frac{s_A}{A} = \frac{0.0747}{5.016} = 0.0149 = 1.49\%    (50)

    \frac{s_b}{|b|} = \frac{0.0129}{0.5161} = 0.0250 = 2.50\%    (51)

which are relatively small, as desired.
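The Hessian, its inverse, and the standard errors of Eqs. (42)-(47) can be reproduced numerically. The following is a minimal Python sketch (NumPy assumed; the equation references in the comments follow the numbering of this memoir, and the sketch is an illustration rather than the author's computation):

    import numpy as np

    X = np.array([0.0, 0.5, 1.0, 1.5, 2.0, 2.5, 3.0, 3.5, 4.0, 4.5, 5.0])
    Y = np.array([5.0, 4.0, 2.8, 2.2, 2.0, 1.5, 1.0, 0.9, 0.6, 0.4, 0.3])
    A, b = 5.0161, -0.5161                  # optimum from the Newton-Raphson fit
    e = np.exp(b * X)

    # Hessian of E at the optimum, Eqs. (9), (10), (12)
    H_AA = 2.0 * np.sum(e ** 2)
    H_Ab = -2.0 * np.sum(X * Y * e) + 4.0 * A * np.sum(X * e ** 2)
    H_bb = -2.0 * A * np.sum(X**2 * Y * e) + 4.0 * A**2 * np.sum(X**2 * e**2)
    H = np.array([[H_AA, H_Ab], [H_Ab, H_bb]])

    H_inv = np.linalg.inv(H)
    S2 = np.sum((Y - A * e) ** 2) / (len(X) - 2)   # variance of the estimate, Eq. (20)
    s_A = np.sqrt(S2 * H_inv[0, 0])                # Eq. (21), approx. 0.075
    s_b = np.sqrt(S2 * H_inv[1, 1])                # Eq. (22), approx. 0.0129
    cov_Ab = S2 * H_inv[0, 1]                      # Eq. (23), approx. -0.0006
    print(H, s_A, s_b, cov_Ab)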


A check of the Hessian inverse shows that

    [H][H]^{-1} \approx \begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix}

within roundoff (the computed elements differ from 0 and 1 by less than 2 x 10^-9), as required. We also note that the determinant of the Hessian

    \det [H] = (4.94388)(166.7581) - (17.8866)(17.8866) = 824.432035428 - 319.930459560 = 504.501575868

is positive, so the Hessian is positive definite as required for convergence.

Equal Probability Contours

In this section we examine combinations of parameters A and b which lead to equal values of the error sum of squares E. The optimized model is described by

    \hat{Y} = \hat{A} \exp(\hat{b} X) = 5.016 \exp(-0.5161\,X)    (52)

which leads to the minimum error sum of squares of

    E(\hat{A}, \hat{b}) = \sum_{i=1}^{11} (Y_i - \hat{Y}_i)^2 = \sum_{i=1}^{11} [Y_i - 5.016 \exp(-0.5161\,X_i)]^2 = 0.1521    (53)

Values of E are calculated for A = 5.016 ± 0.075 with b = -0.5161, and for A = 5.016 with b = -0.5161 ± 0.0129. Results for these values are listed in Table 4. Other combinations of A and b which lead to the same E are also given. These results provide a contour of equal probability which passes through the standard errors for A and b. From the table we note that for A = 5.016 we obtain (b, E) = (-0.5032, 0.1666) and (b, E) = (-0.5290, 0.1652).

This analysis is now extended to various levels of probability.

75% probability contour

It can be shown that the error at some level of probability, say 75%, can be calculated from

    E(A, b, 75\%) = E(\hat{A}, \hat{b}) \left[ 1 + \frac{p}{n - p}\, F(p, n - p, 75\%) \right] = 0.1521 \left[ 1 + \frac{2}{11 - 2}(1.62) \right] = 0.2069 = \sum_{i=1}^{11} [Y_i - A \exp(bX_i)]^2    (54)

where the value of F is obtained from statistical tables as F(2, 9, 75%) = 1.62. Combinations of A and b which satisfy Eq. (54) are listed in Table 5. A graph of the probability contour is shown in


Figure 4, from which estimates are made of (b, E) = (-0.491, 0.2069) and (b, E) = (-0.543, 0.2069).

95% probability contour

It can be shown that the error at the 95% level of probability can be calculated from

    E(A, b, 95\%) = E(\hat{A}, \hat{b}) \left[ 1 + \frac{p}{n - p}\, F(p, n - p, 95\%) \right] = 0.1521 \left[ 1 + \frac{2}{11 - 2}(4.26) \right] = 0.2961 = \sum_{i=1}^{11} [Y_i - A \exp(bX_i)]^2    (55)

where the value of F is obtained from statistical tables as F(2, 9, 95%) = 4.26. Combinations of A and b which satisfy Eq. (55) are listed in Table 6 and shown in Figure 4. Estimates are made of (b, E) = (-0.477, 0.2961) and (b, E) = (-0.560, 0.2961).

99% probability contour

It can be shown that the error at the 99% level of probability can be calculated from

    E(A, b, 99\%) = E(\hat{A}, \hat{b}) \left[ 1 + \frac{p}{n - p}\, F(p, n - p, 99\%) \right] = 0.1521 \left[ 1 + \frac{2}{11 - 2}(8.02) \right] = 0.4232 = \sum_{i=1}^{11} [Y_i - A \exp(bX_i)]^2    (56)

where the value of F is obtained from statistical tables as F(2, 9, 99%) = 8.02. Combinations of A and b which satisfy Eq. (56) are listed in Table 7 and shown in Figure 4. Estimates are made of (b, E) = (-0.466, 0.4232) and (b, E) = (-0.577, 0.4232).
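Instead of reading F from statistical tables, the contour levels of Eqs. (54)-(56) can be generated with a short script. The sketch below assumes SciPy (scipy.stats.f) is available and is an illustration added here, not part of the original analysis:

    import numpy as np
    from scipy.stats import f

    n, p = 11, 2
    E_min = 0.1521
    for q in (0.75, 0.95, 0.99):
        F_q = f.ppf(q, p, n - p)                 # F(p, n-p, q) as used in Eq. (25)
        E_q = E_min * (1.0 + p / (n - p) * F_q)  # contour level of E at probability q
        print(q, round(F_q, 2), round(E_q, 4))
    # prints approx. 1.62 / 0.2069, 4.26 / 0.2961, 8.02 / 0.4232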

Dependence of E on b Near Minimum E

A summary of the values of E vs. b at A = Â = 5.016 which satisfy the various probability levels is given in Table 8. The question is whether or not this relationship follows a parabola given by

    E = \alpha + \beta b + \gamma b^2    (57)

To optimize the parabolic model requires that


    \begin{bmatrix} n & \sum b & \sum b^2 \\ \sum b & \sum b^2 & \sum b^3 \\ \sum b^2 & \sum b^3 & \sum b^4 \end{bmatrix} \begin{bmatrix} \alpha \\ \beta \\ \gamma \end{bmatrix} = \begin{bmatrix} \sum E \\ \sum bE \\ \sum b^2 E \end{bmatrix}    (58)

where again n is the number of observations used in the analysis. Calculations are carried out for the different numbers of points listed in Table 8.

n = 9

    \begin{bmatrix} 9 & -4.6581 & 2.42264121 \\ -4.6581 & 2.42264121 & -1.26609968 \\ 2.42264121 & -1.26609968 & 0.66484222 \end{bmatrix} \begin{bmatrix} \alpha \\ \beta \\ \gamma \end{bmatrix} = \begin{bmatrix} 2.3363 \\ -1.21038451 \\ 0.63123154 \end{bmatrix}    (59)

which leads to the regression equation

    \hat{E} = 20.4827 + 78.3473\,b + 75.5133\,b^2    (60)

The minimum of the parabola occurs at

    \frac{\partial \hat{E}}{\partial b} = 78.3473 + 2(75.5133)\,b = 0 \;\Rightarrow\; b = -0.5188, \; E = 0.1608    (61)

which is inconsistent with b̂ = -0.5161, E = 0.1521. Correlation between Ê and E is given by

    \hat{E} = 0.0216 + 0.9166\,E, \qquad r = 0.99837    (62)

n = 7

    \begin{bmatrix} 7 & -3.6191 & 1.87626821 \\ -3.6191 & 1.87626821 & -0.97538852 \\ 1.87626821 & -0.97538852 & 0.50844216 \end{bmatrix} \begin{bmatrix} \alpha \\ \beta \\ \gamma \end{bmatrix} = \begin{bmatrix} 1.4902 \\ -0.77084361 \\ 0.40009580 \end{bmatrix}    (63)

which leads to the regression equation

    \hat{E} = 22.5631 + 86.5400\,b + 83.5410\,b^2    (64)

The minimum of the parabola occurs at

    \frac{\partial \hat{E}}{\partial b} = 86.5400 + 2(83.5410)\,b = 0 \;\Rightarrow\; b = -0.5179, \; E = 0.1514    (65)


which is inconsistent with b̂ = -0.5161, E = 0.1521. Correlation between Ê and E is given by

    \hat{E} = 0.00088 + 0.9958\,E, \qquad r = 0.99778    (66)

While this is better than using all nine values, it is still off a bit.

n = 5

    \begin{bmatrix} 5 & -2.5821 & 1.33513921 \\ -2.5821 & 1.33513921 & -0.69124118 \\ 1.33513921 & -0.69124118 & 0.35832775 \end{bmatrix} \begin{bmatrix} \alpha \\ \beta \\ \gamma \end{bmatrix} = \begin{bmatrix} 0.8980 \\ -0.46378791 \\ 0.23986750 \end{bmatrix}    (67)

which leads to the regression equation

    \hat{E} = 21.7221 + 83.4693\,b + 80.7507\,b^2    (68)

The minimum of the parabola occurs at

    \frac{\partial \hat{E}}{\partial b} = 83.4693 + 2(80.7507)\,b = 0 \;\Rightarrow\; b = -0.5168, \; E = 0.1522    (69)

which is near b̂ = -0.5161, E = 0.1521. Correlation between Ê and E is given by

    \hat{E} = 0.00046 + 0.99722\,E, \qquad r = 0.99905    (70)

n = 3

    \begin{bmatrix} 3 & -1.5481 & 0.79920921 \\ -1.5481 & 0.79920921 & -0.41276740 \\ 0.79920921 & -0.41276740 & 0.21327177 \end{bmatrix} \begin{bmatrix} \alpha \\ \beta \\ \gamma \end{bmatrix} = \begin{bmatrix} 0.4842 \\ -0.24985331 \\ 0.12898359 \end{bmatrix}    (71)

which leads to the regression equation

    \hat{E} = 22.1430 + 85.2091\,b + 82.5409\,b^2    (72)

The minimum of the parabola occurs at

    \frac{\partial \hat{E}}{\partial b} = 85.2091 + 2(82.5409)\,b = 0 \;\Rightarrow\; b = -0.5162, \; E = 0.1520    (73)


which virtually agrees with b̂ = -0.5161, E = 0.1521. Correlation between Ê and E is given by

    \hat{E} = 0.00010 + 1.00000\,E, \qquad r = 1    (74)

Values of E vs. b are shown in Figure 5, where the parabola is drawn from Eq. (72). As b is moved away from b̂ = -0.5161, the values deviate from the parabola. Note that for b = -0.5524 the value of E lies within the parabolic envelope. For b = -0.5000 the value of E is also within the parabolic envelope. In both cases the procedure converges toward the minimum at b = -0.5161, E = 0.1521.

Note that it is important to carry a large number of digits to avoid roundoff errors in the matrix computational procedure.
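The parabolic fit near the minimum is also easy to repeat with an ordinary polynomial fit in Python. The sketch below (NumPy assumed; an illustration, not the original calculation) uses the three central points of Table 8 and reproduces Eq. (72) to within rounding:

    import numpy as np

    # The three (b, E) pairs closest to the optimum (Table 8, n = 3 case)
    b = np.array([-0.529, -0.5161, -0.503])
    E = np.array([0.1657, 0.1521, 0.1664])

    gamma, beta, alpha = np.polyfit(b, E, 2)   # E-hat = alpha + beta*b + gamma*b^2, Eq. (57)
    b_min = -beta / (2.0 * gamma)              # vertex of the parabola
    E_min = alpha + beta * b_min + gamma * b_min**2
    print(alpha, beta, gamma)                  # approx. 22.14, 85.21, 82.54
    print(b_min, E_min)                        # approx. -0.5162, 0.1520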

Summary

This memoir has focused on the mathematical and statistical characteristics of a nonlinear regression model. The model assumed an exponential relationship between the control variable (X) and the response variable (Y) as described by Eq. (1). Analysis was performed for a given set of data (Table 1). The first step was to linearize the model to the form of Eq. (2). A plot of the data supported this step, as shown in Figure 2. Linear regression of Eq. (2) led to a first estimate of the parameters A and b. This estimate of b was then used to perform nonlinear regression of the model on the data to optimize the values of A and b in order to minimize the error sum of squares (E) between measured and estimated values of Y. It was shown that the Newton-Raphson procedure converged rapidly to the minimum E. Standard errors of the parameters were then estimated, which showed low relative errors in the parameters. A further measure of uncertainty in the parameters was illustrated by the equal probability contours of A vs. b for various levels of probability (see Figure 4). The cross in Figure 4 represents the most probable values of parameters A and b for the exponential model and for this particular set of data, i.e. the values which minimize the error sum of squares between measured and predicted values of the response variable Y. Vertical bars represent the standard error in parameter b around the optimum value b̂, while horizontal bars represent the standard error in parameter A around the optimum value Â. These are equivalent to the standard deviation around the mean of a set of measurements which follow a Gaussian distribution. The contours in Figure 4 represent combinations of parameters A and b which produce various levels of uncertainty. Following Fisher's maximum likelihood method, these represent combinations of equal probability.

It was further shown that E vs. b at optimum A in the neighborhood of minimum E followed a parabolic dependence. At this point it seems appropriate to call attention to several general points about data analysis and mathematical models. R. A. Fisher called attention to two elements of uncertainty in this process in his classic article of 1922 (see Bennett, 1971). Uncertainty in data led to his analysis of variance (ANOVA) procedure, while uncertainty in a model led to a subject called Fisher Information (see Frieden, 1998). In her biography of her father, Joan Fisher Box noted that the passion of Fisher's life was the subject of inference (see Fisher Box, 1978, p. 447), i.e. drawing inference about a system from analysis of a specific set of data.


Scientific research can be divided into two approaches: bottom/up and top/down. In the bottom/up approach data (measurements or observations) are examined in order to identify a unifying theory or model (from specific to general), which is commonly referred to as the process of induction. In the top/down approach a general principle is postulated and the consequences of that principle are developed (from general to specific), which is commonly referred to as the process of deduction. Most research appears to have followed the bottom/up approach. The top/down approach was championed by Einstein and by Paul Dirac (Farmelo, 2009, p. 2, 94, 382). Following a series of lectures by Murray Gell-Mann it appears that Dirac gained increased respect for the bottom/up approach, which Gell-Mann had followed. The work described in this memoir has followed the bottom/up approach. Of course we cannot be certain that the simple exponential model is the very best model possible for the given data. There is always a level of uncertainty. Science progresses by assuming a theory or model and then checking the consequences of the theory through measurements.

A final point has to do with the pursuit of knowledge and understanding of how nature really works. I will call this the battle between 'subjective' and 'objective' criteria for judging the value of ideas in science. According to James Glanz (see Chang, 2000, p. 354) the theoretical physicist Steven Weinberg has battled with thinkers and philosophers of science over this issue. 'Today, one of his major battles is with postmodernist thinkers and philosophers of science who maintain that scientific theories reflect not objective reality but social negotiations among scientists. In its rawest form, this philosophy would say that the theories of the most persuasive or politically powerful scientists become accepted fact.' I was trained on the belief in objective criteria, and I still hold to this view. Otherwise, it becomes a battle for power and control of ideas in science based on personalities. Unfortunately I have observed an increasing trend to cite 'experts' as the source of 'truth' in the evaluation of scientific ideas. Some editors and reviewers seem to find this an attractive alternative in the peer review process.

References

Adby, P.R. and M.A.H. Dempster. 1974. Introduction to Optimization Methods. John Wiley & Sons. New York, NY.

Bennett, J.H. 1971. Collected Papers of R.A. Fisher. Vol. 1 (1912-1924). University of Adelaide.

Chang, L. 2000. Scientists at Work: Profiles of Today's Groundbreaking Scientists from Science Times. McGraw-Hill. New York, NY.

Cornell, J.A. and R.D. Berger. 1987. Factors that influence the value of the coefficient of determination in simple linear and nonlinear regression models. Phytopathology 77:63-70.

Draper, N.R. and H. Smith. 1981. Applied Regression Analysis. John Wiley & Sons. New York, NY.

Farmelo, G. 2009. The Strangest Man: The Hidden Life of Paul Dirac, Mystic of the Atom. Basic Books. New York, NY.

Fisher Box, J. 1978. R.A. Fisher: The Life of a Scientist. John Wiley & Sons. New York, NY.

Frieden, B.R. 1983. Probability, Statistical Optics, and Data Testing. Springer-Verlag. New York, NY.

Frieden, B.R. 1998. Physics from Fisher Information: A Unification. Cambridge University Press. New York, NY.

Overman, A.R., F.G. Martin, and S.R. Wilkinson. 1990. A logistic equation for yield response of forage grass to nitrogen. Commun. Soil Science and Plant Analysis 21:595-609.


Table 1. Dependence of a response variable (Y) on a control variable (X).

      X      Y     ln Y
     0.0    5.0    1.609
     0.5    4.0    1.386
     1.0    2.8    1.030
     1.5    2.2    0.788
     2.0    2.0    0.693
     2.5    1.5    0.405
     3.0    1.0    0.000
     3.5    0.9   -0.105
     4.0    0.6   -0.511
     4.5    0.4   -0.916
     5.0    0.3   -1.204

Table 2. Newton-Raphson iterations of the exponential model for initial b = -0.5524.

      X      Y    exp(bX) at successive iterations                               Ŷ
     0.0    5.0   1.0000  1.0000  1.0000  1.0000  1.0000  1.0000  1.00000      5.016
     0.5    4.0   0.7587  0.7699  0.7716  0.7722  0.7724  0.7725  0.77256      3.875
     1.0    2.8   0.5756  0.5927  0.5954  0.5963  0.5967  0.5968  0.59684      2.994
     1.5    2.2   0.4367  0.4564  0.4594  0.4605  0.4609  0.4610  0.46110      2.313
     2.0    2.0   0.3313  0.3513  0.3544  0.3556  0.3560  0.3561  0.35622      1.787
     2.5    1.5   0.2513  0.2705  0.2735  0.2746  0.2750  0.2751  0.27520      1.380
     3.0    1.0   0.1907  0.2082  0.2110  0.2120  0.2124  0.2125  0.21261      1.066
     3.5    0.9   0.1447  0.1603  0.1628  0.1637  0.1641  0.1642  0.16425      0.824
     4.0    0.6   0.1097  0.1234  0.1256  0.1264  0.1267  0.1268  0.12689      0.637
     4.5    0.4   0.0833  0.0950  0.0969  0.0976  0.0979  0.0980  0.09803      0.492
     5.0    0.3   0.0632  0.0732  0.0748  0.0754  0.0756  0.0757  0.07574      0.380

     b        -0.5524  -0.5230  -0.5186  -0.5170  -0.5164  -0.5162  -0.5161   -0.5161
     A         5.1411   5.0409   5.0251   5.0194   5.0173   5.0165   5.0161    5.0161
     ∂E/∂b    -3.4083  -0.7008  -0.2619  -0.1017  -0.0380  -0.0200  -0.00645
     ∂²E/∂b²  115.7920 160.5555 164.4710 165.9168 166.4853 166.6255 166.7581
     Δb        0.0294   0.0044   0.0016   0.0006   0.0002   0.0001   0.00004
     b (new)  -0.5230  -0.5186  -0.5170  -0.5164  -0.5162  -0.5161  -0.51614


Table 3. Newton-Raphson iterations of the exponential model for initial b = -0.5000.

      X      Y    exp(bX) at successive iterations
     0.0    5.0   1.0000  1.0000  1.0000  1.0000  1.0000  1.0000  1.0000
     0.5    4.0   0.7788  0.7779  0.7748  0.7734  0.7729  0.7727  0.7726
     1.0    2.8   0.6065  0.6051  0.6003  0.5982  0.5974  0.5971  0.5970
     1.5    2.2   0.4724  0.4707  0.4651  0.4627  0.4617  0.4614  0.4612
     2.0    2.0   0.3679  0.3662  0.3603  0.3579  0.3569  0.3565  0.3564
     2.5    1.5   0.2865  0.2849  0.2792  0.2768  0.2758  0.2755  0.2753
     3.0    1.0   0.2231  0.2216  0.2163  0.2141  0.2132  0.2129  0.2127
     3.5    0.9   0.1738  0.1724  0.1676  0.1656  0.1648  0.1645  0.1644
     4.0    0.6   0.1353  0.1341  0.1298  0.1281  0.1274  0.1271  0.1270
     4.5    0.4   0.1054  0.1043  0.1006  0.0991  0.0984  0.0982  0.0981
     5.0    0.3   0.0821  0.0811  0.0779  0.0766  0.0761  0.0759  0.0758

     b        -0.5000  -0.5023  -0.5104  -0.5138  -0.5152  -0.5157  -0.5159
     A         4.9566   4.9653   4.9951   5.0077   5.0130   5.0147   5.0154
     ∂E/∂b    +0.4253  +1.4465  +0.5872  +0.2348  +0.0908  +0.0376  +0.0118
     ∂²E/∂b²  181.9699 179.5627 172.0235 168.8986 167.6221 167.1526 166.9252
     Δb       -0.0023  -0.0081  -0.0034  -0.0014  -0.0005  -0.0002  -0.00007
     b (new)  -0.5023  -0.5104  -0.5138  -0.5152  -0.5157  -0.5159  -0.5160

Table 3. (Continued).

      X      Y    exp(bX) at successive iterations                          Ŷ       Y - Ŷ
     0.0    5.0   1.000000  1.000000  1.000000  1.000000  1.0000000       5.016    -0.016
     0.5    4.0   0.772607  0.772589  0.772583  0.772580  0.7725794       3.875    +0.125
     1.0    2.8   0.596921  0.596894  0.596884  0.596880  0.5968789       2.994    -0.194
     1.5    2.2   0.461185  0.461154  0.461143  0.461138  0.4611364       2.313    -0.113
     2.0    2.0   0.356315  0.356283  0.356271  0.356266  0.3562644       1.787    +0.213
     2.5    1.5   0.275291  0.275260  0.275249  0.275244  0.2752426       1.381    +0.119
     3.0    1.0   0.212692  0.212663  0.212652  0.212648  0.2126467       1.067    -0.067
     3.5    0.9   0.164327  0.164301  0.164292  0.164288  0.1642865       0.824    +0.076
     4.0    0.6   0.126960  0.126938  0.126929  0.126925  0.1269244       0.637    -0.037
     4.5    0.4   0.098090  0.098071  0.098063  0.098060  0.0980591       0.492    -0.092
     5.0    0.3   0.075785  0.075768  0.075762  0.075759  0.0757585       0.380    -0.080

     b        -0.51597   -0.516015  -0.516032  -0.516039  -0.516041      -0.5160
     A         5.01566    5.01582    5.01588    5.01591    5.0159173      5.0160
     ∂E/∂b    +0.007444  +0.002792  +0.001101  +0.000301  +0.000256
     ∂²E/∂b²  166.881252 166.840459 166.825228 166.819155 166.817429
     Δb       -0.000045  -0.000017  -0.000007  -0.000002  -0.0000015
     b (new)  -0.516015  -0.516032  -0.516039  -0.516041  -0.516042


Table 4. Combinations of A and b to satisfy equal probability equation near minimum E.

X Y Y 0.0 5.0 4.941 5.091 5.016 5.016 4.970 5.110 5.060 4.920 0.5 4.0 3.817 3.933 3.900 3.850 3.828 3.936 3.921 3.813 1.0 2.8 2.950 3.039 3.033 2.955 2.949 3.032 3.039 2.954 1.5 2.2 2.278 2.347 2.358 2.269 2.271 2.335 2.355 2.289 2.0 2.0 1.760 1.814 1.834 1.741 1.750 1.799 1.825 1.774 2.5 1.5 1.360 1.401 1.426 1.337 1.348 1.386 1.414 1.375 3.0 1.0 1.051 1.082 1.109 1.026 1.038 1.067 1.096 1.065 3.5 0.9 0.812 0.836 0.862 0.788 0.800 0.822 0.849 0.826 4.0 0.6 0.627 0.646 0.670 0.604 0.616 0.633 0.658 0.640 4.5 0.4 0.484 0.499 0.521 0.464 0.474 0.488 0.510 0.496 5.0 0.3 0.374 0.386 0.405 0.356 0.365 0.376 0.395 0.384 b –0.5161 –0.5161 –0.5032 –0.5290 –0.522 –0.522 –0.510 –0.510 A 4.941 5.091 5.016 5.016 4.97 5.11 5.06 4.92 E 0.1660 0.1665 0.1664 0.1657 0.1647 0.1668 0.1653 0.1673 Target is E = 0.1660 Table 4. (Continued).

X Y Y 0.0 5.0 5.110 4.930 5.017 5.017 4.970 5.050 5.080 0.5 4.0 3.922 3.833 3.901 3.851 3.871 3.871 3.894 1.0 2.8 3.011 2.981 3.033 2.956 3.014 2.967 2.984 1.5 2.2 2.311 2.318 2.359 2.269 2.348 2.274 2.287 2.0 2.0 1.774 1.802 1.834 1.742 1.828 1.743 1.753 2.5 1.5 1.362 1.401 1.426 1.337 1.424 1.336 1.344 3.0 1.0 1.045 1.090 1.109 1.026 1.109 1.024 1.030 3.5 0.9 0.802 0.847 0.862 0.788 0.864 0.785 0.789 4.0 0.6 0.616 0.659 0.670 0.605 0.673 0.601 0.605 4.5 0.4 0.473 0.512 0.521 0.464 0.524 0.461 0.464 5.0 0.3 0.363 0.398 0.405 0.356 0.408 0.353 0.355 b –0.5290 –0.5032 –0.5032 –0.5290 –0.5000 –0.5320 –0.5320 A 5.11 4.93 5.017 5.017 4.97 5.05 5.08 E 0.1663 0.1650 0.1666 0.1652 0.1662 0.1658 0.1648 Target is E = 0.1660


Table 5. Combinations of A and b to satisfy equal probability equation for 75% probability.

X Y Y 0.0 5.0 5.160 4.870 5.210 4.960 5.080 4.830 5.140 5.130 0.5 4.0 3.986 3.762 3.987 3.796 3.956 3.762 3.904 3.897 1.0 2.8 3.080 2.907 3.051 2.905 3.081 2.930 2.966 2.960 1.5 2.2 2.379 2.246 2.335 2.223 2.400 2.282 2.253 2.248 2.0 2.0 1.838 1.735 1.787 1.701 1.869 1.777 1.711 1.708 2.5 1.5 1.420 1.340 1.368 1.302 1.455 1.384 1.300 1.297 3.0 1.0 1.097 1.035 1.047 0.996 1.134 1.078 0.987 0.985 3.5 0.9 0.848 0.800 0.801 0.763 0.883 0.839 0.750 0.748 4.0 0.6 0.655 0.618 0.613 0.584 0.688 0.654 0.570 0.568 4.5 0.4 0.506 0.477 0.469 0.447 0.535 0.509 0.433 0.432 5.0 0.3 0.391 0.369 0.359 0.342 0.417 0.396 0.329 0.328 b –0.5161 –0.5161 –0.535 –0.535 –0.500 –0.500 –0.550 –0.550 A 5.16 4.87 5.21 4.96 5.08 4.83 5.14 5.13 E 0.2035 0.2052 0.2087 0.2064 0.2044 0.2062 0.2082 0.2080 E = 0.2069 target Table 5. (Continued).

X Y Y 0.0 5.0 5.000 4.840 4.920 4.960 4.860 5.050 4.830 4.910 0.5 4.0 3.914 3.788 3.861 3.796 3.813 3.943 3.771 3.855 1.0 2.8 3.063 2.965 3.029 2.905 2.992 3.078 2.944 3.026 1.5 2.2 2.398 2.321 2.377 2.223 2.348 2.403 2.299 2.376 2.0 2.0 1.877 1.817 1.865 1.701 1.842 1.876 1.795 1.865 2.5 1.5 1.469 1.422 1.463 1.302 1.446 1.465 1.401 1.464 3.0 1.0 1.150 1.113 1.148 0.996 1.134 1.144 1.094 1.149 3.5 0.9 0.900 0.871 0.901 0.763 0.890 0.893 0.854 0.902 4.0 0.6 0.704 0.682 0.707 0.584 0.698 0.697 0.667 0.708 4.5 0.4 0.551 0.534 0.555 0.447 0.548 0.544 0.521 0.556 5.0 0.3 0.431 0.418 0.435 0.342 0.430 0.425 0.407 0.437 b –0.490 –0.490 –0.485 –0.485 –0.485 –0.495 –0.495 –0.484 A 5.00 4.84 4.92 4.96 4.86 5.05 4.83 4.91 E 0.2051 0.2042 0.2047 0.2064 0.2077 0.2074 0.2052 0.2077 E = 0.2069 target


Table 5. (Continued).

X Y Y 0.0 5.0 4.840 5.130 4.900 5.190 4.990 5.210 5.040 5.190 0.5 4.0 3.754 3.979 3.769 3.992 3.809 3.977 3.838 3.952 1.0 2.8 2.912 3.087 2.899 3.070 2.908 3.036 2.922 3.009 1.5 2.2 2.259 2.394 2.229 2.361 2.220 2.318 2.225 2.292 2.0 2.0 1.752 1.857 1.715 1.816 1.695 1.769 1.695 1.745 2.5 1.5 1.359 1.441 1.319 1.397 1.294 1.351 1.290 1.329 3.0 1.0 1.054 1.118 1.014 1.074 0.988 1.031 0.983 1.012 3.5 0.9 0.818 0.867 0.780 0.826 0.754 0.787 0.748 0.770 4.0 0.6 0.634 0.672 0.600 0.636 0.575 0.601 0.570 0.587 4.5 0.4 0.492 0.522 0.462 0.489 0.439 0.459 0.434 0.447 5.0 0.3 0.382 0.405 0.355 0.376 0.335 0.350 0.330 0.340 b –0.508 –0.508 –0.525 –0.525 –0.540 –0.540 –0.545 –0.545 A 4.84 5.13 4.90 5.19 4.99 5.21 5.04 5.19 E 0.2095 0.2074 0.2095 0.2054 0.2089 0.2095 0.2068 0.2058 E = 0.2069 target Table 5. (Continued).

X Y Y 0.0 5.0 4.930 5.200 5.080 5.170 5.016 5.016 5.016 0.5 4.0 3.782 3.989 3.862 3.931 3.924 3.823 3.922 1.0 2.8 2.902 3.061 2.937 2.989 3.070 2.914 3.067 1.5 2.2 2.226 2.348 2.233 2.272 2.402 2.221 2.398 2.0 2.0 1.708 1.802 1.698 1.728 1.879 1.693 1.875 2.5 1.5 1.310 1.382 1.291 1.314 1.470 1.291 1.466 3.0 1.0 1.005 1.060 0.981 0.999 1.150 0.984 1.146 3.5 0.9 0.771 0.814 0.746 0.759 0.900 0.750 0.896 4.0 0.6 0.592 0.624 0.567 0.577 0.704 0.572 0.701 4.5 0.4 0.454 0.479 0.431 0.439 0.551 0.436 0.548 5.0 0.3 0.348 0.367 0.328 0.334 0.431 0.332 0.429 b –0.530 –0.530 –0.548 –0.548 –0.491 –0.543 –0.492 A 4.93 5.20 5.08 5.17 5.016 5.016 5.016 E 0.2068 0.2056 0.2071 0.2062 0.2086 0.2220 0.2037 E = 0.2069 target


Table 6. Combinations of A and b to satisfy equal probability equation for 95% probability.

X Y Y 0.0 5.0 5.260 4.780 5.310 4.860 5.170 4.730 5.320 4.980 0.5 4.0 4.064 3.693 4.064 3.719 4.026 3.684 4.031 3.773 1.0 2.8 3.139 2.853 3.110 2.846 3.136 2.869 3.054 2.859 1.5 2.2 2.425 2.204 2.380 2.178 2.442 2.234 2.314 2.166 2.0 2.0 1.874 1.703 1.821 1.667 1.902 1.740 1.753 1.641 2.5 1.5 1.448 1.315 1.394 1.276 1.481 1.355 1.328 1.243 3.0 1.0 1.118 1.016 1.067 0.976 1.154 1.055 1.006 0.942 3.5 0.9 0.864 0.785 0.816 0.747 0.898 0.822 0.763 0.714 4.0 0.6 0.667 0.607 0.625 0.572 0.700 0.640 0.578 0.541 4.5 0.4 0.516 0.469 0.478 0.438 0.545 0.499 0.438 0.410 5.0 0.3 0.398 0.362 0.366 0.335 0.424 0.388 0.332 0.310 b –0.5161 –0.5161 –0.535 –0.535 –0.500 –0.500 –0.555 –0.555 A 5.26 4.78 5.31 4.86 5.17 4.73 5.31 4.98 E 0.2986 0.2900 0.2946 0.2897 0.2949 0.2956 0.2932 0.2931 E = 0.2961 target Table 6. (Continued).

X Y Y 0.0 5.0 5.120 4.710 5.050 4.710 4.900 5.070 5.290 5.320 4.910 0.5 4.0 4.007 3.687 3.972 3.705 3.874 3.822 3.988 4.051 3.739 1.0 2.8 3.137 2.885 3.125 2.914 3.062 2.882 3.007 3.085 2.847 1.5 2.2 2.455 2.258 2.458 2.293 2.421 2.172 2.267 2.349 2.168 2.0 2.0 1.922 1.768 1.934 1.803 1.914 1.638 1.709 1.789 1.651 2.5 1.5 1.504 1.384 1.521 1.419 1.513 1.235 1.288 1.362 1.257 3.0 1.0 1.177 1.083 1.196 1.116 1.196 0.931 0.971 1.037 0.957 3.5 0.9 0.921 0.848 0.941 0.878 0.946 0.702 0.732 0.790 0.729 4.0 0.6 0.721 0.663 0.740 0.691 0.748 0.529 0.552 0.601 0.555 4.5 0.4 0.564 0.519 0.582 0.543 0.591 0.399 0.416 0.458 0.423 5.0 0.3 0.442 0.406 0.458 0.427 0.467 0.301 0.314 0.349 0.322 b –0.490 –0.490 –0.480 –0.480 –0.470 –0.565 –0.565 –0.545 –0.545 A 5.12 4.71 5.05 4.71 4.90 5.07 5.29 5.32 4.91 E 0.2926 0.2989 0.2981 0.2987 0.2921 0.2944 0.2930 0.2912 0.2944 E = 0.2961 target


Table 6. (Continued).

X Y Y 0.0 5.0 4.720 5.000 4.740 5.140 5.250 4.860 4.770 5.200 5.200 0.5 4.0 3.722 3.943 3.747 3.865 3.948 3.846 3.775 3.907 3.908 1.0 2.8 2.935 3.109 2.963 2.907 2.969 3.044 2.987 2.935 2.936 1.5 2.2 2.315 2.452 2.342 2.186 2.233 2.409 2.364 2.205 2.207 2.0 2.0 1.825 1.934 1.852 1.644 1.679 1.906 1.871 1.656 1.658 2.5 1.5 1.440 1.525 1.464 1.236 1.263 1.508 1.480 1.244 1.246 3.0 1.0 1.135 1.203 1.157 0.930 0.950 1.194 1.172 0.935 0.936 3.5 0.9 0.895 0.948 0.915 0.699 0.714 0.945 0.927 0.702 0.704 4.0 0.6 0.706 0.748 0.723 0.526 0.537 0.748 0.734 0.528 0.529 4.5 0.4 0.557 0.590 0.572 0.395 0.404 0.592 0.581 0.396 0.397 5.0 0.3 0.439 0.465 0.452 0.297 0.304 0.468 0.459 0.298 0.299 b –0.475 –0.475 –0.470 –0.570 –0.570 –0.468 –0.468 –0.572 –0.5715 A 4.72 5.00 4.74 5.14 5.25 4.86 4.77 5.20 5.20 E 0.2948 0.2960 0.2942 0.2967 0.2952 0.2821 0.2887 0.2994 0.2960 E = 0.2961 target Table 6. (Continued).

X Y Y 0.0 5.0 5.190 5.180 5.170 4.800 4.810 4.910 5.320 4.810 5.290 0.5 4.0 3.909 3.901 3.894 3.799 3.806 3.739 4.051 3.700 4.069 1.0 2.8 2.944 2.938 2.933 3.006 3.012 2.847 3.085 2.845 3.129 1.5 2.2 2.217 2.213 2.209 2.379 2.384 2.168 2.349 2.188 2.407 2.0 2.0 1.670 1.667 1.663 1.883 1.886 1.651 1.789 1.683 1.851 2.5 1.5 1.258 1.255 1.253 1.490 1.493 1.257 1.362 1.295 1.424 3.0 1.0 0.947 0.945 0.944 1.179 1.181 0.957 1.037 0.996 1.095 3.5 0.9 0.713 0.712 0.711 0.933 0.935 0.729 0.790 0.766 0.842 4.0 0.6 0.537 0.536 0.535 0.738 0.740 0.555 0.601 0.589 0.648 4.5 0.4 0.405 0.404 0.403 0.584 0.585 0.423 0.458 0.453 0.498 5.0 0.3 0.305 0.304 0.304 0.462 0.463 0.322 0.349 0.348 0.383 b –0.567 –0.567 –0.567 –0.468 –0.468 –0.545 –0.545 –0.525 –0.525 A 5.19 5.18 5.17 4.80 4.81 4.91 5.32 4.81 5.29 E 0.2813 0.2748 0.2756 0.2809 0.2800 0.2944 0.2912 0.2940 0.2991 E = 0.2961 target


Table 6. (Continued).

X Y Y 0.0 5.0 4.820 4.810 4.810 4.810 4.820 4.800 4.790 4.750 5.220 0.5 4.0 3.815 3.807 3.807 3.808 3.816 3.800 3.793 3.685 4.049 1.0 2.8 3.020 3.014 3.013 3.015 3.022 3.009 3.003 2.858 3.141 1.5 2.2 2.390 2.386 2.385 2.387 2.392 2.382 2.377 2.217 2.436 2.0 2.0 1.892 1.888 1.888 1.890 1.894 1.886 1.882 1.720 1.890 2.5 1.5 1.498 1.495 1.494 1.497 1.500 1.493 1.490 1.334 1.466 3.0 1.0 1.186 1.183 1.182 1.185 1.187 1.182 1.180 1.035 1.137 3.5 0.9 0.938 0.937 0.936 0.938 0.940 0.936 0.934 0.803 0.882 4.0 0.6 0.743 0.741 0.741 0.743 0.744 0.741 0.740 0.623 0.684 4.5 0.4 0.588 0.587 0.586 0.588 0.589 0.587 0.586 0.483 0.531 5.0 0.3 0.465 0.464 0.464 0.466 0.467 0.465 0.464 0.375 0.412 b –0.4675 –0.4675 –0.4677 –0.4670 –0.4670 –0.4670 –0.4670 –0.508 –0.508 A 4.82 4.81 4.81 4.81 4.82 4.80 4.79 4.75 5.22 E 0.2818 0.2829 0.2813 0.2853 0.2846 0.2864 0.2882 0.2932 0.2918 E = 0.2961 target Table 7. Combinations of A and b to satisfy equal probability equation for 99% probability.

X Y Y 0.0 5.0 5.350 4.680 5.280 4.640 5.160 4.600 5.420 4.780 0.5 4.0 4.133 3.616 4.112 3.614 4.059 3.618 4.138 3.649 1.0 2.8 3.193 2.793 3.202 2.814 3.193 2.846 3.158 2.786 1.5 2.2 2.467 2.158 2.494 2.192 2.512 2.239 2.411 2.126 2.0 2.0 1.906 1.667 1.942 1.707 1.976 1.761 1.841 1.623 2.5 1.5 1.472 1.288 1.513 1.329 1.554 1.385 1.406 1.239 3.0 1.0 1.137 0.995 1.178 1.035 1.223 1.090 1.073 0.946 3.5 0.9 0.879 0.769 0.806 0.763 0.962 0.857 0.819 0.722 4.0 0.6 0.679 0.594 0.715 0.628 0.756 0.674 0.625 0.551 4.5 0.4 0.524 0.459 0.557 0.489 0.595 0.530 0.477 0.421 5.0 0.3 0.405 0.354 0.433 0.381 0.468 0.417 0.364 0.321 b –0.5161 –0.5161 –0.500 –0.500 –0.480 –0.480 –0.540 –0.540 A 5.35 4.68 5.28 4.64 5.16 4.60 5.42 4.78 E 0.4274 0.4311 0.4301 0.4193 0.4285 0.4259 0.4248 0.4254 E = 0.4232 target


Table 7. (Continued).

X Y Y 0.0 5.0 5.440 4.890 4.960 5.440 5.040 5.420 5.360 5.160 5.220 0.5 4.0 4.111 3.696 3.730 4.091 3.771 4.056 3.991 3.842 4.086 1.0 2.8 3.107 2.793 2.805 3.076 2.822 3.035 2.971 2.860 3.198 1.5 2.2 2.349 2.111 2.109 2.314 2.112 2.271 2.212 2.130 2.503 2.0 2.0 1.775 1.596 1.586 1.740 1.580 1.699 1.647 1.586 1.959 2.5 1.5 1.341 1.206 1.193 1.308 1.182 1.271 1.226 1.180 1.533 3.0 1.0 1.014 0.911 0.897 0.984 0.885 0.951 0.913 0.879 1.200 3.5 0.9 0.766 0.689 0.675 0.740 0.662 0.712 0.680 0.654 0.939 4.0 0.6 0.579 0.521 0.507 0.556 0.495 0.533 0.506 0.487 0.735 4.5 0.4 0.438 0.393 0.382 0.418 0.371 0.399 0.377 0.363 0.576 5.0 0.3 0.331 0.297 0.287 0.315 0.277 0.298 0.281 0.270 0.450 b –0.560 –0.560 –0.570 –0.570 –0.580 –0.580 –0.590 –0.590 –0.490 A 5.44 4.89 4.96 5.44 5.04 5.42 5.36 5.16 5.22 E 0.4193 0.4209 0.4188 0.4239 0.4221 0.4251 0.4241 0.4231 0.4220 E = 0.4232 target Table 7. (Continued).

X Y Y 0.0 5.0 5.280 4.610 4.990 4.800 4.600 5.080 4.620 4.630 4.930 0.5 4.0 3.925 3.663 3.965 3.837 3.637 4.016 3.616 3.696 3.927 1.0 2.8 2.918 2.910 3.150 3.067 2.875 3.175 2.830 2.944 3.128 1.5 2.2 2.169 2.312 2.503 2.451 2.273 2.510 2.215 2.345 2.491 2.0 2.0 1.613 1.837 1.989 1.959 1.797 1.984 1.734 1.868 1.984 2.5 1.5 1.199 1.460 1.580 1.566 1.421 1.569 1.357 1.488 1.581 3.0 1.0 0.891 1.160 1.255 1.252 1.123 1.240 1.062 1.185 1.259 3.5 0.9 0.663 0.921 0.997 1.001 0.888 0.980 0.831 0.944 1.003 4.0 0.6 0.493 0.732 0.792 0.800 0.702 0.775 0.651 0.752 0.799 4.5 0.4 0.366 0.582 0.630 0.639 0.555 0.613 0.509 0.599 0.636 5.0 0.3 0.272 0.462 0.500 0.511 0.439 0.484 0.399 0.477 0.507 b –0.593 –0.460 –0.460 –0.448 –0.470 –0.470 –0.490 –0.455 –0.455 A 5.28 4.61 4.99 4.80 4.60 5.08 4.62 4.63 4.93 E 0.4207 0.4213 0.4264 0.4222 0.4192 0.4222 0.4171 0.4188 0.4252 E = 0.4232 target


Table 7. (Continued).

X Y Y 0.0 5.0 4.730 5.390 4.830 5.440 5.430 4.660 5.310 4.720 5.380 0.5 4.0 3.629 4.135 3.669 4.132 4.124 3.615 4.119 3.630 4.138 1.0 2.8 2.784 3.173 2.787 3.139 3.133 2.804 3.195 2.792 3.183 1.5 2.2 2.136 2.434 2.117 2.384 2.380 2.175 2.478 2.148 2.448 2.0 2.0 1.639 1.867 1.608 1.811 1.807 1.687 1.922 1.652 1.883 2.5 1.5 1.257 1.433 1.221 1.375 1.373 1.309 1.491 1.270 1.448 3.0 1.0 0.965 1.099 0.928 1.045 1.043 1.015 1.157 0.977 1.114 3.5 0.9 0.740 0.843 0.705 0.794 0.792 0.787 0.897 0.751 0.857 4.0 0.6 0.568 0.647 0.535 0.603 0.602 0.611 0.696 0.578 0.659 4.5 0.4 0.436 0.496 0.407 0.458 0.457 0.474 0.540 0.445 0.507 5.0 0.3 0.334 0.381 0.309 0.348 0.347 0.368 0.419 0.342 0.390 b –0.530 –0.530 –0.550 –0.550 –0.550 –0.508 –0.508 –0.525 –0.525 A 4.73 5.39 4.83 5.44 5.43 4.66 5.31 4.72 5.38 E 0.4346 0.4174 0.4246 0.4301 0.4159 0.4221 0.4174 0.4191 0.4259 E = 0.4232 target Table 7. (Continued).

X Y Y 0.0 5.0 4.700 4.680 4.670 4.660 4.860 4.720 4.700 4.750 0.5 4.0 3.753 3.737 3.729 3.721 3.881 3.773 3.757 3.799 1.0 2.8 2.997 2.984 2.978 2.971 3.099 3.016 3.003 3.038 1.5 2.2 2.393 2.383 2.378 2.373 2.474 2.410 2.400 2.429 2.0 2.0 1.911 1.903 1.899 1.895 1.976 1.927 1.919 1.943 2.5 1.5 1.526 1.519 1.516 1.513 1.578 1.540 1.534 1.554 3.0 1.0 1.218 1.213 1.211 1.208 1.260 1.231 1.226 1.243 3.5 0.9 0.973 0.969 0.967 0.965 1.006 0.984 0.980 0.994 4.0 0.6 0.777 0.774 0.772 0.770 0.803 0.786 0.783 0.795 4.5 0.4 0.620 0.618 0.616 0.615 0.641 0.629 0.626 0.635 5.0 0.3 0.495 0.493 0.492 0.491 0.512 0.502 0.500 0.508 b –0.450 –0.450 –0.450 –0.450 –0.450 –0.448 –0.448 –0.447 A 4.70 4.68 4.67 4.66 4.86 4.72 4.70 4.75 E 0.4063 0.4139 0.4183 0.4229 0.4280 0.4159 0.4200 0.4226 E = 0.4232 target


Table 8. Correlation of E with b using a parabolic model with A = 5.016.

       b          E      Ê (n=9)   Ê (n=7)   Ê (n=5)   Ê (n=3)
    -0.577     0.4232    0.4169
    -0.560     0.2961    0.2892    0.2992
    -0.543     0.2069    0.2051    0.2039    0.2075
    -0.529     0.1657    0.1689    0.1616    0.1642    0.1656
    -0.5161    0.1521    0.1612    0.1517    0.1523    0.1520
    -0.503     0.1664    0.1796    0.1701    0.1677    0.1663
    -0.491     0.2069    0.2190    0.2121    0.2061
    -0.477     0.2961    0.2925    0.2915
    -0.462     0.4232    0.4041


List of Figures

1. Dependence of response variable (Y) on control variable (X). Data from Table 1. Curve drawn from Eq. (35).

2. Dependence of ln Y on X. Data from Table 1. Line drawn from Eq. (33).

3. Scatter plot of estimated response variable (Ŷ) vs. measured response variable (Y). Line represents the 45° diagonal.

4. Equal probability contours between parameters A and b. Contours drawn from Table 5 (75%), Table 6 (95%), and Table 7 (99%) probability levels, respectively. Optimum and standard error values of A = 5.016 ± 0.075 and b = -0.5161 ± 0.0129 are also shown.

5. Dependence of error sum of squares (E) on exponential parameter (b) for linear parameter A = 5.016. Parabola drawn from Eq. (72).


Figure 1. Dependence of response variable (Y) on control variable (X). Data from Table 1. Curve

drawn from Eq. (35).


Figure 2. Dependence of ln Y on X. Data from Table 1. Line drawn from Eq. (33).


Figure 3. Scatter plot of estimated response variable (Ŷ) vs. measured response variable (Y). Line represents the 45° diagonal.


Figure 4. Equal probability contours between parameters A and b. Contours drawn from Table 5 (75%), Table 6 (95%), and Table 7 (99%) probability levels, respectively. Optimum and standard error values of A = 5.016 ± 0.075 and b = -0.5161 ± 0.0129 are also shown.


Figure 5. Dependence of error sum of squares (E) on the exponential parameter (b) for linear

parameter A = 5.016. Parabola drawn from Eq. (72).


ABE 6933 Special Topics Mathematical and Statistical Characteristics of Nonlinear Regression Models

A. R. Overman

I. Elements of Probability and Calculus

A. Arithmetic – the process of counting

B. Natural numbers – positive integers (0, 1, 2, ...)

C. Rational numbers – ratio of two integers (1/1, 2/1, ..., 1/2, 1/3, 2/3, ...)

D. Irrational numbers (such as √2, e, π, etc.)

E. Complex numbers – z = x + i y with i = √(-1)

F. Binomial theorem and Pascal's triangle

(a + b)^0 = 1
(a + b)^1 = a^1 + b^1
(a + b)^2 = a^2 + 2ab + b^2
(a + b)^3 = a^3 + 3a^2 b + 3ab^2 + b^3
(a + b)^4 = a^4 + 4a^3 b + 6a^2 b^2 + 4ab^3 + b^4
(a + b)^5 = a^5 + 5a^4 b + 10a^3 b^2 + 10a^2 b^3 + 5ab^4 + b^5
(a + b)^6 = a^6 + 6a^5 b + 15a^4 b^2 + 20a^3 b^3 + 15a^2 b^4 + 6ab^5 + b^6
(a + b)^7 = a^7 + 7a^6 b + 21a^5 b^2 + 35a^4 b^3 + 35a^3 b^4 + 21a^2 b^5 + 7ab^6 + b^7
(a + b)^8 = a^8 + 8a^7 b + 28a^6 b^2 + 56a^5 b^3 + 70a^4 b^4 + 56a^3 b^5 + 28a^2 b^6 + 8ab^7 + b^8
(a + b)^9 = a^9 + 9a^8 b + 36a^7 b^2 + 84a^6 b^3 + 126a^5 b^4 + 126a^4 b^5 + 84a^3 b^6 + 36a^2 b^7 + 9ab^8 + b^9

Note the symmetry in the distribution of coefficients for each expansion.


Pascal’s triangle for binomial coefficients 1

1 2 1

1 3 3 1

1 4 6 4 1

1 5 10 10 5 1

1 6 15 20 15 6 1

1 7 21 35 35 21 7 1

1 8 28 56 70 56 28 8 1

1 9 36 84 126 126 84 36 9 1

1 10 45 120 210 252 210 120 45 10 1

1 11 55 165 330 462 462 330 165 55 11 1

1 12 66 220 495 792 924 792 495 220 66 12 1

Note the pattern in the coefficients, including symmetry.

G. Frequency distributions

1. Discrete distribution

Consider the problem of a peg board. This is a two state system – a cell (hole) is either filled or empty. Each cell holds one and only one object (peg), which can be viewed as a type of exclusion principle. Define n as the total number of cells and x as the number of filled cells (pegs). Cells (holes) are indistinguishable (all alike), as are the objects (pegs). Order of filling the cells is irrelevant. Note that a peg board can be linear, triangular, rectangular (Eigen and Winkler, 1993, p. 40; Polster, 2004, p. 33), or even 3-dimensional. The number of distinguishable combinations which are possible for each x, c_{n,x}, can be calculated from (Ruhla, 1992, p. 18; Watkins, 2000, p. 22)

c_{n,x} = n! / [x! (n – x)!]     (1)

where n! = 1·2·3···n and is called ‘n factorial’. Note that n can assume positive integers (n = 1, 2, 3, …) and x can assume the integers x = 0, 1, 2, …, n. For small values of n it is easy to estimate c by intuition, but for larger n calculations of c are best performed on a pocket calculator or computer with the algorithm for computations (Eq. (1)) built in. The total number of combinations C for the system is defined as the sum of c values for all values of x, and can be calculated from C = 2^n. The frequency distribution of c values is then calculated from f = c/C. Cumulative frequency is calculated from the cumulative sum

F = Σ f     (2)

so that F is normalized (0 ≤ F ≤ 1). It should be noted that F forms a discrete set of numbers for a particular case.
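For readers who want to generate these counts directly, a minimal sketch in Python follows Eqs. (1) and (2); this is not part of the original memoir, and the function name and the choice of n = 4 are illustrative only.

```python
from math import comb  # comb(n, x) = n!/(x!(n - x)!), i.e. Eq. (1)

def pegboard_distribution(n):
    """Return c, f, F for a peg board with n cells (Eqs. (1) and (2))."""
    c = [comb(n, x) for x in range(n + 1)]   # distinguishable combinations
    C = 2 ** n                               # total combinations, C = 2^n
    f = [ci / C for ci in c]                 # frequency distribution, f = c/C
    F = []
    running = 0.0
    for fi in f:                             # cumulative frequency, Eq. (2)
        running += fi
        F.append(running)
    return c, f, F

c, f, F = pegboard_distribution(4)           # e.g. the 1 x 4 linear board
print(c)   # [1, 4, 6, 4, 1]
print(f)   # [0.0625, 0.25, 0.375, 0.25, 0.0625]
print(F)   # [0.0625, 0.3125, 0.6875, 0.9375, 1.0]
```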

2. Continuous distribution

The next step is to compare the discrete distribution to a continuous Gaussian distribution, where x is considered a continuous variable and the cumulative distribution is described by

F = (1/2)[1 + erf((x – μ)/(√2 σ))]     (3)

where μ and σ are the mean and spread of the distribution. The ‘error function’ is defined by

erf[(x – μ)/(√2 σ)] = (2/√π) ∫₀^{(x–μ)/(√2 σ)} exp(–u^2) du     (4)

where exp(–u^2) represents the Gaussian distribution (bell-shaped curve). Values of the erf can be obtained from mathematical tables (cf. Abramowitz and Stegun, 1965, chp. 7). Some properties of the error function should be noted:

erf (0) = 0, erf (∞) = 1, erf (–x) = – erf (+x), erf (–∞) = – 1

Equation (3) can be rearranged to the linear form

Z = erf⁻¹(2F – 1) = (1/(√2 σ))(x – μ)     (5)

where erf⁻¹ is the inverse error function. For example, erf⁻¹(0.8427) = 1.00. Linear regression of Z vs. x leads to values of the parameters μ and σ. With these parameters now known, the frequency distribution f̂ vs. x can be calculated for the continuous distribution from

f̂ = (1/(√(2π) σ)) exp[–(x – μ)^2/(2σ^2)]     (6)

The procedure can now be applied to a linear peg board, triangular peg board, and square peg board. It can even be applied to a 3-dimensional system. This analysis falls within a branch of mathematics known as group theory. Values of the error function can be calculated from the series approximation (Abramowitz and Stegun, 1965, p. 299)

erf x ≅ 1 – 1/[1 + 0.278393 x + 0.230389 x^2 + 0.000972 x^3 + 0.078108 x^4]^4     (7)

for 0 ≤ x ≤ 1.8. For the case where erf x is given, the inverse erf⁻¹ and therefore x can be obtained on a scientific calculator or computer using the solver routine. Note that for the case F < 0.5 and 2F – 1 < 0 (negative), the procedure is to change the value from – to +, solve for the inverse by Eq. (7), and change the sign from +x to –x. Equation (7) does not work directly for –x because the power series in Eq. (7) is not symmetric.
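The fitting procedure of Eqs. (3)–(6) can be sketched in a few lines of Python (an illustration, not the memoir's calculator-based routine). Here scipy.special.erfinv stands in for Eq. (7) plus the solver, and each cumulative F is paired with the cell boundary x + 1/2; this pairing is inferred from the appendix tables and reproduces the fitted values reported there (e.g. μ = 2.000 and √2 σ ≈ 1.390 for n = 4).

```python
import numpy as np
from math import comb, pi, sqrt
from scipy.special import erfinv   # stands in for Eq. (7) and the solver routine

def fit_gaussian_to_pegboard(n):
    """Estimate mu and sqrt(2)*sigma by linear regression of Z on x (Eq. (5))."""
    x = np.arange(n + 1)
    f = np.array([comb(n, k) for k in x], dtype=float) / 2.0 ** n   # discrete f = c/C
    F = np.cumsum(f)
    keep = F < 1.0                              # erfinv(1) is infinite; drop the last row
    Z = erfinv(2.0 * F[keep] - 1.0)             # Eq. (5)
    xb = x[keep] + 0.5                          # cumulative F belongs to the cell boundary
    slope, intercept = np.polyfit(xb, Z, 1)     # Z = (x - mu)/(sqrt(2)*sigma)
    sqrt2_sigma = 1.0 / slope
    mu = -intercept * sqrt2_sigma
    f_hat = np.exp(-((x - mu) / sqrt2_sigma) ** 2) / (sqrt(pi) * sqrt2_sigma)  # Eq. (6)
    return mu, sqrt2_sigma, f, f_hat

mu, s2s, f, f_hat = fit_gaussian_to_pegboard(4)
print(round(mu, 3), round(s2s, 4))   # approx. 2.000 and 1.390 (cf. the 1 x 4 / 2 x 2 case)
```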

H. Symmetry and conservation principle

In all of the discrete and continuous Gaussian distributions we note symmetry in the distributions around a mean point. A mathematical consequence of this property is that something is conserved (remains constant) in the system. Note that the number of filled cells is defined by x. Since this is a two-state (binary) system (cells are either empty or filled), it follows that the number of unfilled cells is n – x. The total capacity of the system is the sum of filled and unfilled cells, so that total capacity = x + (n – x) = n. While this is obvious for our case, it illustrates the connection between symmetry and conservation. This property turns out to be very important in the various models of physics (including mechanics, electromagnetism, relativity, and quantum mechanics). It also shows up in chemistry and biology.

Figure 1. Linear pegboard

Figure 2. Triangular pegboard

Figure 3. Square pegboard

Frequency distributions for a linear peg board

Case 1 x 2
In this case n = 1 x 2 = 2 and x = 0, 1, 2.
Table A1. Frequency distribution for the linear peg board with a 1 x 2 array.

x c f F Z c f ˆ 0.0000 0 1 0.2500 0.2167 0.867 0.2500 –0.4767 1 2 0.5000 0.5379 2.152 0.7500 +0.4767 2 1 0.2500 0.2167 0.867 1.0000 C 4 = 22 Note symmetry in the frequency distribution.

Ẑ = erf⁻¹(2F – 1) = (1/(√2 σ))(x – μ) = –0.9534 + 0.9534 x,   r = 1
√2 σ = 1.0489, μ = 1.000
f̂ = (1/(√(2π) σ)) exp[–(x – μ)^2/(2σ^2)] = 0.5379 exp[–(x – 1.00)^2/(1.0489)^2]
f̂ = –0.1045 + 1.2848 f,   r = 1
ĉ = 4 f̂

Case 1 x 3 In this case n = 1 x 3 = 3 and x = 0, 1, 2, 3. Table A2. Frequency distribution for the linear peg board with a 1 x 3 array.

x c f F Z f c 0.0000 0 1 0.1250 0.1037 0.83 0.1250 –0.8142 1 3 0.3750 0.3888 3.11 0.5000 0.0000 2 3 0.3750 0.3888 3.11 0.8750 +0.8142 3 1 0.1250 0.1037 0.83 1.0000 C 8 = 23 Note symmetry in the frequency distribution.

Ẑ = erf⁻¹(2F – 1) = –1.2213 + 0.8142 x,   r = 1.0000
√2 σ = 1.2282, μ = 1.500
f̂ = 0.4594 exp[–(x – 1.50)^2/(1.2282)^2]
f̂ = –0.03885 + 1.1404 f,   r = 1.0000
ĉ = 8 f̂

Case 1 x 4 In this case n = 1 x 4 = 4 and x = 0, 1, 2, 3, 4. Table A3. Frequency distribution for the linear peg board with a 1 x 4 array.

x c f F Z f c 0.0000 0 1 0.0625 0.0512 0.82 0.0625 –1.0842 1 4 0.2500 0.2419 3.87 0.3125 –0.3452 2 6 0.3750 0.4060 6.50 0.6875 +0.3452 3 4 0.2500 0.2419 3.87 0.9375 +1.0842 4 1 0.0625 0.0512 0.82 1.0000 C 16 = 24 Note symmetry in the frequency distribution.

Ẑ = erf⁻¹(2F – 1) = –1.4391 + 0.7196 x,   r = 0.999909
√2 σ = 1.3897, μ = 2.000
f̂ = 0.4060 exp[–(x – 2.00)^2/(1.3897)^2]
f̂ = –0.0226 + 1.1052 f,   r = 0.99710
ĉ = 16 f̂

Case 1 x 5 In this case n = 1 x 5 = 5 and x = 0, 1, 2, 3, 4, 5. Table A4. Frequency distribution for the linear peg board with a 1 x 5 array.

x c f F Z f c 0.00000 0 1 0.03125 0.02590 0.83 0.03125 –1.3148 1 5 0.15625 0.14143 4.53 0.18750 –0.6277 2 10 0.31250 0.33050 10.58 0.50000 0.0000 3 10 0.31250 0.33050 10.58 0.81250 +0.6277 4 5 0.15625 0.14143 4.53 0.96875 +1.3148 5 1 0.03125 0.02590 0.83 1.00000 C 32 = 25 Note symmetry in the frequency distribution.

Ẑ = erf⁻¹(2F – 1) = –1.6286 + 0.65146 x,   r = 0.99983
√2 σ = 1.5350, μ = 2.500
f̂ = 0.3675 exp[–(x – 2.50)^2/(1.5350)^2]
f̂ = –0.01543 + 1.0882 f,   r = 0.99723
ĉ = 32 f̂

Case 1 x 6 In this case n = 1 x 6 = 6 and x = 0, 1, 2, 3, 4, 5, 6. Table A5. Frequency distribution for the linear peg board with a 1 x 6 array.

x c f F Z f c 0.000000 0 1 0.015625 0.013247 0.85 0.015625 –1.5213 1 6 0.093750 0.080181 5.13 0.109375 –0.8703 2 15 0.234375 0.236181 15.11 0.343750 –0.2841 3 20 0.312500 0.338560 21.67 0.656250 +0.2841 4 15 0.234375 0.236181 15.11 0.890625 +0.8703 5 6 0.093750 0.080181 5.13 0.984375 +1.5213 6 1 0.015625 0.013247 0.85 1.000000 C 64 = 26 Note symmetry in the frequency distribution.

Ẑ = erf⁻¹(2F – 1) = –1.8003 + 0.60009 x,   r = 0.99975
√2 σ = 1.66643, μ = 3.000
f̂ = 0.33856 exp[–(x – 3.00)^2/(1.66643)^2]
f̂ = –0.01187 + 1.0809 f,   r = 0.99730
ĉ = 64 f̂

Case 1 x 7 In this case n = 1 x 7 = 7 and x = 0, 1, 2, 3, 4, 5, 6, 7. Table A6. Frequency distribution for the linear peg board with a 1 x 7 array.

x c f F Z f 0.0000000 0 1 0.0078125 0.008580 0.0078125 –1.7123 1 7 0.0546875 0.049286 0.0625000 –1.0842 * 2 21 0.1640625 0.158080 0.2265625 –0.5305 3 35 0.2734375 0.283108 0.5000000 0.0000 4 35 0.2734375 0.283108 0.7734375 +0.5305 5 21 0.1640625 0.158080 0.9375000 +1.0842 * 6 7 0.0546875 0.049286 0.9921875 +1.7123 7 1 0.0078125 0.008580 1.0000000 C 128 = 27 Note symmetry in the frequency distribution.

Ẑ = erf⁻¹(2F – 1) = –1.88923 + 0.53978 x,   r = 0.999963
√2 σ = 1.8526, μ = 3.500
f̂ = 0.3045 exp[–(x – 3.50)^2/(1.8526)^2]
f̂ = –0.011875 + 1.0689 f,   r = 0.999198

Case 1 x 8 In this case n = 1 x 8 = 8 and x = 0, 1, 2, 3, 4, 5, 6, 7, 8. Table A7. Frequency distribution for the linear peg board with a 1 x 8 array.

x c f F Z f 0.000000 0 1 0.003906 0.004626 0.5 0.003906 –1.8934 0.012167 1 8 0.031250 0.028129 1.5 0.035156 –1.2777 * 0.057167 2 28 0.109375 0.102126 2.5 0.144531 –0.7504 0.160373 3 56 0.218750 0.221375 3.5 0.363281 –0.2470 0.268612 4 70 0.273438 0.286500 4.5 0.636719 +0.2470 0.268612 5 56 0.218750 0.221375 5.5 0.855469 +0.7504 0.160373 6 28 0.109375 0.102126 6.5 0.964844 +1.2777 * 0.057167 7 8 0.031250 0.028129 7.5 0.996094 +1.8934 0.012167 8 1 0.003906 0.004626 1.000000 C 256 = 28 Note symmetry in the frequency distribution.

Ẑ = erf⁻¹(2F – 1) = –2.0312 + 0.5078 x,   r = 0.999946
√2 σ = 1.9692, μ = 4.0000
f̂ = 0.2865 exp[–(x – 4.00)^2/(1.9692)^2]
f̂ = –0.008700 + 1.0589 f,   r = 0.999084

Frequency distributions of a triangular peg board Case 1: 1 Cell (n = 1) For this case n = 1 and x = 0 or 1. The distribution is given in Table A8. Table A8. Frequency distribution for a triangular peg board with 1 cell. x c f F 0.000 0 1 0.500 0.500 1 1 0.500 1.000 C 2 = 21 Case 2: 3 Cells (n = 3) For this case n = 3 and x = 0, 1, 2, or 3. The pegboard is shown in the diagram. The corresponding distribution is given in Table A9. Table A9. Frequency distribution for a triangular peg board with 3 cells.

x c f F Z f 0.0000 0 1 0.1250 0.1034 0.5 0.1250 –0.8142 0.2367 1 3 0.3750 0.3892 1.5 0.5000 0.0000 0.4594 2 3 0.3750 0.3892 2.5 0.8750 +0.8142 0.2367 3 1 0.1250 0.1034 1.0000 C 8 = 23

Ẑ = erf⁻¹(2F – 1) = –1.2213 + 0.8142 x,   r = 1.0000
√2 σ = 1.2282, μ = 1.500
f̂ = 0.4594 exp[–(x – 1.500)^2/(1.2282)^2]
f̂ = –0.03950 + 1.1432 f,   r = 1.000000

This system is an example of group theory in mathematics, which links principles of symmetry and conservation. Note symmetry of c around the mean value of x (1.50). Conservation comes from: Filled cells + Unfilled cells = x + (n – x) = n = total capacity of the system (number of cells).

Case 3: 6 Cells (n = 6)
For this case n = 6 and x = 0, 1, 2, 3, 4, 5, 6. The distribution is given in Table A10.
Table A10. Frequency distribution for a triangular peg board with 6 cells.

x c f F Z f 0.000000 0 1 0.015625 0.013246 0.015625 –1.5213 1 6 0.093750 0.080177 0.109375 –0.8703 2 15 0.234375 0.236178 0.343750 –0.2841 3 20 0.312500 0.338560 0.656250 +0.2841 4 15 0.234375 0.236178 0.890625 +0.8703 5 6 0.093750 0.080177 0.984375 +1.5213 6 1 0.015625 0.013246 1.000000 C 64 = 26

Ẑ = erf⁻¹(2F – 1) = –1.8003 + 0.6001 x,   r = 0.999748
√2 σ = 1.6664, μ = 3.000
f̂ = 0.33856 exp[–(x – 3.00)^2/(1.6664)^2]
f̂ = –0.01187 + 1.0809 f,   r = 0.99730

Case 4: 10 Cells (n = 10) For this case n = 10 and x = 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10. The distribution is given in Table A11. Table A11. Frequency distribution for a triangular peg board with 10 cells.

x c f F Z f 0.0000000 0 1 0.0009766 0.001342 0.0009766 1 10 0.0097656 0.008921 0.0107422 –1.6260 2 45 0.0439453 0.038930 0.0546875 –1.1311 3 120 0.1171875 0.111513 0.1718750 –0.6700 4 210 0.2050781 0.209679 0.3769531 –0.2215 5 252 0.2460938 0.258800 0.6230469 +0.2215 6 210 0.2050781 0.209679 0.8281250 +0.6700 7 120 0.1171875 0.111513 0.9453125 +1.1311 8 45 0.0439453 0.038930 0.9892578 +1.6260 9 10 0.0097656 0.008921 0.9990234 10 1 0.0009766 0.001342 1.0000000 C 1024 = 210

Ẑ = erf⁻¹(2F – 1) = –2.2939 + 0.4588 x,   r = 0.99988
√2 σ = 2.1797, μ = 5.000
f̂ = 0.2588 exp[–(x – 5.000)^2/(2.1797)^2]
f̂ = –0.00352 + 1.0383 f,   r = 0.99894

Case 5: 15 Cells (n = 15)
For this case n = 15 and x = 0, 1, 2, 3, …, 15. The distribution is given in Table A12.

Table A12. Frequency distribution for a triangular peg board with 15 cells.

x c f F Z f 0.0000000 0 1 0.0000305 0.000096 0.0000305 1 15 0.0004578 0.000652 0.0004883 2 105 0.0032043 0.003355 * 0.0036926 3 455 0.0138855 0.013138 0.0175781 –1.4875 4 1365 0.0416565 0.039158 0.0592346 –1.1032 5 3003 0.0916443 0.088829 0.1508789 –0.7309 6 5005 0.1527405 0.153358 0.3036194 –0.3630 7 6435 0.1963806 0.201504 0.5000000 0.0000 8 6435 0.1963806 0.201504 0.6963806 +0.3630 9 5005 0.1527405 0.153358 0.8491211 +0.7309 10 3003 0.0916443 0.088829 0.9407654 +1.1032 11 1365 0.0416565 0.039158 0.9824219 +1.4875 12 455 0.0138855 0.013138 0.9963074 13 105 0.0032043 0.003355 * 0.9995117 14 15 0.0004578 0.000652 0.9999695 15 1 0.0000305 0.000096 1.0000000 C 32768 = 215

Ẑ = erf⁻¹(2F – 1) = –2.7711 + 0.3695 x,   r = 0.999972
√2 σ = 2.7065, μ = 7.500

f̂ = 0.2085 exp[–(x – 7.500)^2/(2.7065)^2]
f̂ = –0.002046 + 1.02423 f,   r = 0.999637

Case 6: 21 Cells (n = 21)
For this case n = 21 and x = 0, 1, 2, 3, …, 21. The distribution is given in Table A13.

Table A13. Frequency distribution for a triangular peg board with 21 cells.

x c f F Z f 0.000000 0 1 0.000000 0.000004 0.000000 1 21 0.000010 0.000028 0.000010 2 210 0.000100 0.000159 0.000110 3 1,330 0.000634 0.000749 0.000744 4 5,985 0.002854 0.002913 * 0.003598 5 20,349 0.009703 0.009333 0.013301 –1.5668 6 54,264 0.025875 0.024629 0.039176 –1.2429 7 116,280 0.055447 0.053525 0.094623 –0.9287 8 203,490 0.097032 0.095807 0.191655 –0.6168 9 293,930 0.140157 0.141240 0.331812 –0.3071 10 352,716 0.168188 0.171489 0.500000 0.0000 11 352,716 0.168188 0.171489 0.668188 +0.3071 12 293,930 0.140157 0.141240 0.808345 +0.6168 13 203,490 0.097032 0.095807 0.905377 +0.9287 14 116,280 0.055447 0.053525 0.960824 +1.2429 15 54,264 0.025875 0.024629 0.986699 +1.5668 16 20,349 0.009703 0.009333 0.996402 17 5,985 0.002854 0.002913 * 0.999256 18 1,330 0.000634 0.000749 0.999890

19 210 0.000100 0.000159 0.999990 20 21 0.000010 0.000028 1.000000 21 1 0.000000 0.000004 1.000000 C 2,097,152 = 221

Ẑ = erf⁻¹(2F – 1) = –3.2707 + 0.3115 x,   r = 0.999982
√2 σ = 3.2103, μ = 10.500
f̂ = 0.1757 exp[–(x – 10.500)^2/(3.2103)^2]
f̂ = –0.00135 + 1.0183 f,   r = 0.99980

Case 7: 28 Cells (n = 28)
For this case n = 28 and x = 0, 1, 2, 3, …, 28. The distribution is given in Table A14.

Table A14. Frequency distribution for a triangular peg board with 28 cells.

x c f F Z f 0.0000000 0 1 0.0000000 0.000000 0.0000000 1 28 0.0000001 0.000001 0.0000001 2 378 0.0000014 0.000004 0.0000015 3 3,276 0.0000122 0.000022 0.0000137 4 20,475 0.0000763 0.000104 0.0000900 5 98,280 0.0003661 0.000414 0.0004561 6 376,740 0.0014035 0.001431 * 0.0018596 7 1,184,040 0.0044109 0.004273 0.0062705 –1.7706 8 3,108,105 0.0115786 0.011029 0.0178491 –1.4830 9 6,906,900 0.0257302 0.024603 0.0435793 –1.2079 10 13,123,110 0.0488874 0.047433 0.0924667 –0.9378 11 21,474,180 0.0799975 0.079034 0.1724642 –0.6684 12 30,421,755 0.1133299 0.113814 0.2857941 –0.3996 13 37,442,160 0.1394829 0.141652 0.4252770 –0.1335 14 40,116,600 0.1494460 0.152370 0.5747230 +0.1335 15 37,442,160 0.1394829 0.141652 0.7142059 +0.3996 16 30,421,755 0.1133299 0.113814 0.8275358 +0.6684 17 21,474,180 0.0799975 0.079034 0.9075333 +0.9378 18 13,123,110 0.0488874 0.047433 0.9564207 +1.2079

19 6,906,900 0.0257302 0.024603 0.9821509 +1.4830 20 3,108,105 0.0115786 0.011029 0.9937295 +1.7706 21 1,184,040 0.0044109 0.004273 0.9981404 22 376,740 0.0014035 0.001431 * 0.9995439 23 98,280 0.0003661 0.000414 0.9999100 24 20,475 0.0000763 0.000104 0.9999863 25 3,276 0.0000122 0.000022 0.9999985 26 378 0.0000014 0.000004 0.9999999 27 28 0.0000001 0.000001 1.0000000 28 1 0.0000000 0.000000 1.0000000 C 268,435,456 = 228

Ẑ = erf⁻¹(2F – 1) = –3.7810 + 0.27007 x,   r = 0.999974
√2 σ = 3.70275, μ = 14.000
f̂ = 0.15237 exp[–(x – 14.00)^2/(3.70275)^2]
f̂ = –0.000995 + 1.01674 f,   r = 0.99985

Frequency distributions for a square peg board

Case 1: 2 x 2
For this case n = 2 x 2 = 4 and x assumes values of 0, 1, 2, 3, and 4. Corresponding values of c_{n,x} are calculated from Eq. (1). Results are given in Table A15.
Table A15. Frequency distribution for the square peg board with a 2 x 2 array.

x c f F Z f 0.0000 0 1 0.0625 0.05117 0.0625 –1.0842 1 4 0.2500 0.24191 0.3125 –0.3452 2 6 0.3750 0.40600 0.6875 +0.3452 3 4 0.2500 0.24191 0.9375 +1.0842 4 1 0.0625 0.05117 1.0000 C 16 = 24 Note the symmetry in the discrete frequency distribution.

Ẑ = erf⁻¹(2F – 1) = –1.4391 + 0.7196 x,   r = 0.999909
√2 σ = 1.3897, μ = 2.000
f̂ = 0.4060 exp[–(x – 2.00)^2/(1.3897)^2]
f̂ = –0.02262 + 1.1053 f,   r = 0.9971

Case 2: 3 x 3
For this case n = 3 x 3 = 9 and x assumes values of 0, 1, 2, 3, …, 9. Corresponding values of c_{n,x} are calculated from Eq. (1). Results are given in Table A16.
Table A16. Frequency distribution for the square peg board with a 3 x 3 array.

x c f F Z f 0.000000 0 1 0.001953 0.002497 * 0.001953 1 9 0.017578 0.015920 0.019531 –1.4568 2 36 0.070312 0.063869 0.089843 –0.9491 3 84 0.164063 0.161257 0.253906 –0.4680 4 126 0.246094 0.256230 0.500000 0.0000 5 126 0.246094 0.256230 0.746094 +0.4680 6 84 0.164063 0.161257 0.910157 +0.9491 7 36 0.070312 0.063869 0.980469 +1.4568 8 9 0.017578 0.015920 0.998047 9 1 0.001953 0.002497 * 1.000000 Total 512 = 29 Again note the symmetry in the discrete frequency distribution.

Ẑ = erf⁻¹(2F – 1) = –2.1653 + 0.4812 x,   r = 0.999919
√2 σ = 2.0782, μ = 4.500
f̂ = 0.2715 exp[–(x – 4.50)^2/(2.0782)^2]
f̂ = –0.00373 + 1.0369 f,   r = 0.99895

Case 3: 4 x 4
For this case n = 4 x 4 = 16 and x assumes values of 0, 1, 2, 3, …, 16. Corresponding values of c_{n,x} are calculated from Eq. (1). Results are given in Table A17.
Table A17. Frequency distribution for the square peg board with a 4 x 4 array.

x c f F Z f 0.0000000 0 1 0.0000153 0.000044 0.0000153 1 16 0.0002441 0.000319 0.0002594 2 120 0.0018311 0.001966 * 0.0020905 3 560 0.0085449 0.008102 0.0106354 –1.6287 4 1820 0.0277710 0.025807 0.0384064 –1.2493 5 4368 0.0666504 0.063543 0.1050568 –0.8868 6 8008 0.1221924 0.120946 0.2272492 –0.5288 7 11440 0.1745605 0.177953 0.4018097 –0.1759 8 12870 0.1963806 0.202400 0.5981903 +0.1759 9 11440 0.1745605 0.177953 0.7727508 +0.5288 10 8008 0.1221924 0.120946 0.8949432 +0.8868 11 4368 0.0666504 0.063543 0.9615936 +1.2493 12 1820 0.0277710 0.025807 0.9893646 +1.6287 13 560 0.0085449 0.008102 0.9979095 14 120 0.0018311 0.001966 * 0.9997406 15 16 0.0002441 0.000369 0.9999847 16 1 0.0000153 0.000053 1.0000000 C 65536 = 216 Note the symmetry in the frequency distribution.

Ẑ = erf⁻¹(2F – 1) = –2.8703 + 0.3588 x,   r = 0.999958
√2 σ = 2.7872, μ = 8.000
f̂ = 0.2024 exp[–(x – 8.00)^2/(2.7872)^2]
f̂ = –0.00194 + 1.0248 f,   r = 0.99962

Several characteristics should be noted from these calculations. First, note the symmetry in the distributions in the tables. Second, note that the continuous Gaussian function approximates the discrete distributions rather well. Third, as the number of values increases, agreement between the discrete and continuous distributions improves.

Case 4: 5 x 5
For this case n = 5 x 5 = 25 and x assumes values of 0, 1, 2, 3, …, 25. Corresponding values of c_{n,x} are calculated from Eq. (1). Results are given in Table A18.
Table A18. Frequency distribution for the square peg board with a 5 x 5 array.

x c f F Z f 0.00000000 0 1 0.00000003 0.000000 0.00000003 1 25 0.00000075 0.000003 0.00000078 2 300 0.00000894 0.000020 0.00000972 3 2,300 0.00006855 0.000101 0.00007827 4 12,650 0.00037700 0.000439 0.00045527 5 53,130 0.00158340 0.001624 * 0.00203867 6 177,100 0.00527799 0.005101 0.00731666 –1.7297 7 480,700 0.01432598 0.013604 0.02164264 –1.4266 8 1,081,575 0.03223345 0.030810 0.05387609 –1.1362 9 2,042,975 0.06088540 0.059254 0.11476149 –0.8503 10 3,268,760 0.09741664 0.096770 0.21217813 –0.5650 11 4,457,400 0.13284087 0.134200 0.34501900 –0.2816 12 5,200,300 0.15498102 0.158037 0.50000002 0.0000 13 5,200,300 0.15498087 0.158037 0.65498089 +0.2816 14 4,457,400 0.13284087 0.134200 0.78782176 +0.5650 15 3,268,760 0.09741664 0.096770 0.88523840 +0.8503 16 2,042,975 0.06088540 0.059254 0.94612380 +1.1362 17 1,081,575 0.03223345 0.030810 0.97835725 +1.4266 18 480,700 0.01432598 0.013604

0.99268323 +1.7297 19 177,100 0.00527799 0.005101 0.99796122 20 53,130 0.00158340 0.001624 * 0.99954462 21 12,650 0.00037700 0.000439 0.99992162 22 2,300 0.00006855 0.000101 0.99999017 23 300 0.00000894 0.000020 0.99999911 24 25 0.00000075 0.000003 0.99999986 25 1 0.00000003 0.000000 0.99999989 C 33,554,432 = 225 Note the symmetry in the frequency distribution.

Ẑ = erf⁻¹(2F – 1) = –3.5740 + 0.2859 x,   r = 0.999974
√2 σ = 3.4975, μ = 12.50
f̂ = 0.1613 exp[–(x – 12.50)^2/(3.4975)^2]
f̂ = –0.001129 + 1.017789 f,   r = 0.999827

Case 5: 6 x 6
For this case n = 6 x 6 = 36 and x assumes values of 0, 1, 2, 3, …, 36. Corresponding values of c_{n,x} are calculated from Eq. (1). Results are given in Table A19.
Table A19. Frequency distribution for the square peg board with a 6 x 6 array.

x c f F Z f 0.000000 0 1 0.000000 1 36 0.00000000 0.000000 0.00000000 2 630 0.00000001 0.000000 0.00000001 3 7,140 0.00000010 0.000000 0.00000010 4 58,905 0.00000086 0.000002 0.00000096 5 376,992 0.00000549 0.000010 0.00000645 6 1,947,792 0.00002834 0.000039 0.00003479 7 8,347,680 0.00012147 0.000144 0.00015626 8 30,260,340 0.00044035 0.000472 0.00059661 9 94,143,280 0.00136997 0.001380 * 0.00196658 10 254,186,856 0.00369891 0.003606 0.00566549 –1.7970 11 600,805,296 0.00874287 0.008415 0.01440836 –1.5442 12 1,251,677,700 0.01821431 0.017541 0.03262267 –1.3013 13 2,310,789,600 0.03362641 0.032657 0.06624908 –1.0633 14 3,796,297,200 0.05524340 0.054303 0.12149248 –0.8263 15 5,567,902,560 0.08102365 0.080647 0.20251613 –0.5890 16 7,307,872,110 0.10634354 0.106974 0.30885967 –0.3524 17 8,597,496,600 0.12511004 0.126733 0.43396971 –0.1179 18 9,075,135,300 0.13206060 0.134100

0.56603031 +0.1179 19 8,597,496,600 0.12511004 0.126733 0.69114035 +0.3524 20 7,307,872,110 0.10634354 0.106974 0.79748389 +0.5890 21 5,567,902,560 0.08102365 0.080647 0.87850754 +0.8263 22 3,796,297,200 0.05524340 0.054303 0.93375094 +1.0633 23 2,310,789,600 0.03362641 0.032657 0.96737735 +1.3013 24 1,251,677,700 0.01821431 0.017541 0.98559166 +1.5442 25 600,805,296 0.00874287 0.008415 0.99433453 +1.7970 26 254,186,856 0.00369891 0.003606 0.99803344 27 94,143,280 0.00136997 0.001380 * 0.99940341 28 30,260,340 0.00044035 0.000472 0.99984376 29 8,347,680 0.00012147 0.000144 0.99996523 30 1,947,792 0.00002834 0.000039 0.99999357 31 376,992 0.00000549 0.000010 0.99999906 32 58,905 0.00000086 0.000002 0.99999992 33 7,140 0.00000010 0.000000 1.00000002 34 630 0.00000001 1.00000003 35 36 0.00000000 1.00000003 36 1 0.00000000 1.00000003 C 68,719,476,736 = 236 Note the symmetry in the frequency distribution.

Ẑ = erf⁻¹(2F – 1) = –4.2786 + 0.2377 x,   r = 0.999981
√2 σ = 4.2070, μ = 18.00
f̂ = 0.1341 exp[–(x – 18.00)^2/(4.2070)^2]
f̂ = –0.000723 + 1.01356 f,   r = 0.999909

Case 6: 7 x 7
For this case n = 7 x 7 = 49 and x assumes values of 0, 1, 2, 3, …, 49. Corresponding values of c_{n,x} are calculated from Eq. (1). Results are given in Table A20.
Table A20. Frequency distribution for the square peg board with a 7 x 7 array.

x c f F Z f 0.000000 0 1 0.000000 0.000000 1 49 0.000000 0.000000 0.000000 2 1,176 0.000000 0.000000 0.000000 3 18,424 0.000000 0.000000 0.000000 4 211,876 0.000000 0.000002 0.000000 5 1,906,884 0.000000 0.000000 0.000000 6 13,983,816 0.000000 0.000000 0.000000 7 85,900,584 0.000000 0.000000 0.000000 8 450,978,066 0.000001 0.000002 0.000001 9 2,054,455,634 0.000004 0.000006 0.000005 10 8,217,822,536 0.000015 0.000020 0.000020 11 29,135,916,264 0.000052 0.000064 0.000072 12 92,263,734,836 0.000164 0.000185 0.000236 13 262,596,783,764 0.000466 0.000497 0.000702 14 675,248,872,536 0.001199 0.001228 * 0.001901 15 1,575,580,703,000 0.002799 0.002795 0.004700 16 3,348,108,993,000 0.005947 0.005859 0.010647 –1.6285 17 6,499,270,398,000 0.011545 0.011315 0.022192 –1.4192 18 11,554,258,486,000 0.020524 0.020124

0.042716 –1.2145 19 18,851,684,898,000 0.033487 0.032967 0.076203 –1.0119 20 28,277,527,346,000 0.050231 0.049740 0.126434 –0.8093 21 39,049,918,716,000 0.069367 0.069120 0.195801 –0.6061 22 49,699,896,548,000 0.088285 0.088467 0.284086 –0.4032 23 58,343,356,817,000 0.103639 0.104288 0.387725 –0.2016 24 63,205,303,219,000 0.112275 0.113230 0.500000 0.0000 25 63,205,303,219,000 0.112275 0.113230 0.612275 +0.2016 26 58,343,356,817,000 0.103639 0.104288 0.715914 +0.4032 27 49,699,896,548,000 0.088285 0.088467 0.804199 +0.6061 28 39,049,918,716,000 0.069367 0.069120 0.873566 +0.8093 29 28,277,527,346,000 0.050231 0.049740 0.923797 +1.0119 30 18,851,684,898,000 0.033487 0.032967 0.957284 +1.2145 31 11,554,258,486,000 0.020524 0.020124 0.977808 +1.4192 32 6,499,270,398,000 0.011545 0.011315 0.989353 +1.6285 33 3,348,108,993,000 0.005947 0.005859 0.995300 34 1,575,580,703,000 0.002799 0.002795 0.998099 35 675,248,872,536 0.001199 0.001228 * 0.999298 36 262,596,783,764 0.000466 0.000497 0.999764 37 92,263,734,836 0.000164 0.000185 0.999928 38 29,135,916,264 0.000052 0.000064 0.999980 39 8,217,822,536 0.000015 0.000020 0.999995 40 2,054,455,634 0.000004 0.000006 0.999999 41 450,978,066 0.000001 0.000002

1.000000 42 85,900,584 0.000000 0.000000 43 13,983,816 44 1,906,884 45 211,876 46 18,424 47 1,176 48 49 49 1 C 562,949,953,421,000 = 249 Note the symmetry in the frequency distribution.

Ẑ = erf⁻¹(2F – 1) = –4.9687 + 0.20281 x,   r = 0.9999962
√2 σ = 4.9308, μ = 24.50
f̂ = 0.1144 exp[–(x – 24.50)^2/(4.9308)^2]
f̂ = –0.000345 + 1.00727 f,   r = 0.999965

Case 7: 8 x 8
For this case n = 8 x 8 = 64 and x assumes values of 0, 1, 2, 3, …, 64. Corresponding values of c_{n,x} are calculated from Eq. (1). Results are given in Table A21.
Table A21. Frequency distribution for the square peg board with an 8 x 8 array.

x c f F Z f 0.000000 0 1 0.000000 0.000000 1 64 0.000000 0.000000 0.000000 2 3 4 7 8 0.00000000 1018 9 0.00000003 1018 10 0.00000015 1018 11 0.00000074 1018 0.0000000 0.0000000 12 0.00000328 1018 0.0000002 0.000000 0.0000002 13 0.00001314 1018 0.0000007 0.000001 0.0000009 14 0.00004786 1018 0.0000026 0.000004 0.0000035 15 0.00015952 1018 0.0000086 0.000011 0.0000121 16 0.00048853 1018 0.0000265 0.000032 0.0000386 17 0.00137937 1018 0.0000748 0.000084 0.0001134 18 0.00360169 1018 0.0001952 0.000210 0.0003086 19 0.00871988 1018 0.0004727 0.000491 0.0007813

20 0.01961973 1018 0.0010636 0.001079 * 0.0018449 21 0.04110800 1018 0.0022285 0.002224 0.0040734 22 0.08034745 1018 0.0043556 0.004306 0.0084390 –1.6919 23 0.14672143 1018 0.0079538 0.007828 0.0163828 –1.5077 24 0.25064910 1018 0.0135877 0.013363 0.0299705 –1.3279 25 0.40103857 1018 0.0217403 0.021421 0.0517108 –1.1504 26 0.60155785 1018 0.0326105 0.032244 0.0843213 –0.9736 27 0.84663698 1018 0.0458963 0.045576 0.1302176 –0.7965 28 1.11877029 1018 0.0606487 0.060493 0.1908663 –0.6189 29 1.38881829 1018 0.0752880 0.075395 0.2661543 –0.4412 30 1.62028801 1018 0.0878360 0.088238 0.3539903 –0.2645 31 1.77709008 1018 0.0963362 0.096971 0.4503265 –0.0887 32 1.83262414 1018 0.0993468 0.100070 0.5496733 +0.0887 33 1.77709008 1018 0.0963362 0.096971 0.6460095 +0.2645 34 1.62028801 1018 0.0878360 0.088238 0.7338455 +0.4412 35 1.38881829 1018 0.0752880 0.075395 0.8091335 +0.6189 36 1.11877029 1018 0.0606487 0.060493 0.8697822 +0.7965 37 0.84663698 1018 0.0458963 0.045576 0.9156785 +0.9736 38 0.60155785 1018 0.0326105 0.032244 0.9482890 +1.1504 39 0.40103857 1018 0.0217403 0.021421 0.9700293 +1.3279 40 0.25064910 1018 0.0135877 0.013363 0.9836170 +1.5077 41 0.14672143 1018 0.0079538 0.007828 0.9915708 +1.6919 42 0.08034745 1018 0.0043556 0.004306 0.9959264

43 0.04110800 1018 0.0022285 0.002224 0.9981549 44 0.01961973 1018 0.0010636 0.001079 * 0.9992185 45 0.00871988 1018 0.0004727 0.000491 0.9996912 46 0.00360169 1018 0.0001952 0.000210 0.9998864 47 0.00137937 1018 0.0000748 0.000084 0.9999612 48 0.00048853 1018 0.0000265 0.000032 0.9999877 49 0.00015952 1018 0.0000086 0.000011 0.9999963 50 0.00004786 1018 0.0000026 0.000004 0.9999989 51 0.00001314 1018 0.0000007 0.000001 0.9999996 52 0.00000328 1018 0.0000002 0.000000 0.9999998 53 0.00000074 1018 0.0000000 0.9999998 54 0.00000015 1018 55 0.00000003 1018 56 0.00000000 1018 57 58 59 60 61 62 63 64 64 1 C 18.4467441 1018 = 264

Note the symmetry in the frequency distribution.

Ẑ = erf⁻¹(2F – 1) = –5.67576 + 0.17737 x,   r = 0.9999963
√2 σ = 5.6380, μ = 32.000
f̂ = 0.10007 exp[–(x – 32.00)^2/(5.6380)^2]
f̂ = –0.000252 + 1.00622 f,   r = 0.999978

Case 8: 9 x 9
For this case n = 9 x 9 = 81 and x assumes values of 0, 1, 2, 3, …, 81. Corresponding values of c_{n,x} are calculated from Eq. (1). Results are given in Table A22.
Table A22. Frequency distribution for the square peg board with a 9 x 9 array.

x c f F Z f 0.000000 0 1 0.000000 0.000000 1 81 0.000000 0.000000 0.000000 2 3 4 5 6 14 0.00000002 1023 15 0.00000008 1023 16 0.00000034 1023 0.0000000 17 0.00000128 1023 0.0000000 0.0000000 18 0.00000457 1023 0.0000002 0.000000 0.0000002 19 0.00001514 1023 0.0000006 0.000001 0.0000008 20 0.00004694 1023 0.0000019 0.000003 0.0000027 21 0.00013636 1023 0.0000056 0.000007 0.0000083 22 0.00037190 1023 0.0000154 0.000018 0.0000237 23 0.00095400 1023 0.0000395 0.000044 0.0000632 24 0.00230549 1023 0.0000953 0.000103

0.0001585 25 0.00525652 1023 0.0002174 0.000228 0.0003759 26 0.01132173 1023 0.0004683 0.000480 0.0008442 27 0.02306279 1023 0.0009539 0.000961 0.0017981 28 0.04447825 1023 0.0018396 0.001834 * 0.0036377 29 0.08128783 1023 0.0033620 0.003329 0.0069997 –1.7415 30 0.14089890 1023 0.0058274 0.005750 0.0128271 –1.5769 31 0.23180142 1023 0.0095871 0.009449 0.0224142 –1.4162 32 0.36218972 1023 0.0149798 0.014777 0.0373940 –1.2579 33 0.53779686 1023 0.0222428 0.021988 0.0596368 –1.1008 34 0.75924263 1023 0.0314015 0.031133 0.0910383 –0.9439 35 1.01955439 1023 0.0421678 0.041944 0.1332061 –0.7866 36 1.30276395 1023 0.0538811 0.053771 0.1870872 –0.6288 37 1.58444264 1023 0.0655310 0.065592 0.2526182 –0.4709 38 1.83461779 1023 0.0758780 0.076134 0.3284962 –0.3136 39 2.02278372 1023 0.0836604 0.084087 0.4121566 –0.1571 40 2.12392290 1023 0.0878434 0.088370 0.5000000 0.0000 41 2.12392290 1023 0.0878434 0.088370 0.5878434 +0.1571 42 2.02278372 1023 0.0836604 0.084087 0.6715038 +0.3136 43 1.83461779 1023 0.0758780 0.076134 0.7473818 +0.4709 44 1.58444264 1023 0.0655310 0.065592 0.8129128 +0.6288 45 1.30276395 1023 0.0538811 0.053771 0.8667939 +0.7866 46 1.01955439 1023 0.0421678 0.041944 0.9089617 +0.9439 47 0.75924263 1023 0.0314015 0.031133

0.9403632 +1.1008 48 0.53779686 1023 0.0222428 0.021988 0.9626060 +1.2579 49 0.36218972 1023 0.0149798 0.014777 0.9775858 +1.4162 50 0.23180142 1023 0.0095871 0.009449 0.9871729 +1.5769 51 0.14089890 1023 0.0058274 0.005750 0.9930003 +1.7415 52 0.08128783 1023 0.0033620 0.003329 0.9963623 53 0.04447825 1023 0.0018396 0.001834 * 0.9982019 54 0.02306279 1023 0.0009539 0.000961 0.9991558 55 0.01132173 1023 0.0004683 0.000480 0.9996241 56 0.00525652 1023 0.0002174 0.000228 0.9998415 57 0.00230549 1023 0.0000953 0.000103 0.9999368 58 0.00095400 1023 0.0000395 0.000044 0.9999763 59 0.00037190 1023 0.0000154 0.000018 0.9999917 60 0.00013636 1023 0.0000056 0.000007 0.9999973 61 0.00004694 1023 0.0000019 0.000003 0.9999992 62 0.00001514 1023 0.0000006 0.000001 0.9999998 63 0.00000457 1023 0.0000002 0.000000 1.0000000 64 0.00000128 1023 0.0000000 0.000000 1.0000000 65 0.00000034 1023 0.0000000 1.0000000 66 0.00000008 1023 0.000000 1.0000000 67 0.00000002 1023 0.000000 1.0000000 68 0.00000000 1023 0.000000 1.0000000 69 0.00000000 1023 0.000000 1.0000000 C 24.1785164 1023 = 281

Note the symmetry in the frequency distribution.

Ẑ = erf⁻¹(2F – 1) = –6.38305 + 0.157606 x,   r = 0.9999962
√2 σ = 6.34493, μ = 40.5000
f̂ = 0.08892 exp[–(x – 40.50)^2/(6.34493)^2]
f̂ = –0.000232 + 1.00595 f,   r = 0.999985

Case 9: 10 x 10
For this case n = 10 x 10 = 100 and x assumes values of 0, 1, 2, 3, …, 100. Corresponding values of c_{n,x} are calculated from Eq. (1). Results are given in Table A23.
Table A23. Frequency distribution for the square peg board with a 10 x 10 array.

x c f F Z f 0.000000 0 1 0.000000 0.000000 1 100 0.000000 0.000000 0.000000 2 3 4 5 6 20 0.00000001 1029 21 0.00000002 1029 22 0.00000007 1029 23 0.00000025 1029 0.0000000 0.0000000 24 0.00000080 1029 0.0000001 0.0000001 25 0.00000243 1029 0.0000002 0.000000 0.0000003 26 0.00000700 1029 0.0000006 0.000001 0.0000009 27 0.00001917 1029 0.0000015 0.000002 0.0000024 28 0.00004999 1029 0.0000039 0.000005 0.0000063 29 0.00012411 1029 0.0000098 0.000011 0.0000161 30 0.00029372 1029 0.0000232 0.000026

0.0000393 31 0.00066325 1029 0.0000523 0.000056 0.0000916 32 0.00143012 1029 0.0001128 0.000118 0.0002044 33 0.00294692 1029 0.0002325 0.000239 0.0004369 34 0.00580717 1029 0.0004581 0.000465 0.0008950 35 0.01095067 1029 0.0008639 0.000867 0.0017589 36 0.01977205 1029 0.0015597 0.001553 * 0.0033186 37 0.03420030 1029 0.0026979 0.002673 0.0060165 –1.7815 38 0.05670049 1029 0.0044729 0.004420 0.0104894 –1.6325 39 0.09013924 1029 0.0071107 0.007020 0.0176001 –1.4871 40 0.13746234 1029 0.0108439 0.010708 0.0284440 –1.3440 41 0.20116440 1029 0.0158691 0.015692 0.0443131 –1.2024 42 0.28258809 1029 0.0222923 0.022088 0.0666054 –1.0613 43 0.38116533 1029 0.0300686 0.029866 0.0966740 –0.9202 44 0.49378236 1029 0.0389526 0.038790 0.1356266 –0.7787 45 0.61448471 1029 0.0484743 0.048394 0.1841009 –0.6367 46 0.73470998 1029 0.0579584 0.057996 0.2420593 –0.4946 47 0.84413487 1029 0.0665905 0.066763 0.3086498 –0.3529 48 0.93206559 1029 0.0735270 0.073826 0.3821768 –0.2119 49 0.98913083 1029 0.0780287 0.078417 0.4602055 –0.0711 50 1.00891345 1029 0.0795892 0.080010 0.5397947 +0.0711 51 0.98913083 1029 0.0780287 0.078417 0.6178234 +0.2119 52 0.93206559 1029 0.0735270 0.073826 0.6913504 +0.3529 53 0.84413487 1029 0.0665905 0.066763

0.7579409 +0.4946 54 0.73470998 1029 0.0579584 0.057996 0.8158993 +0.6367 55 0.61448471 1029 0.0484743 0.048394 0.8643736 +0.7787 56 0.49378236 1029 0.0389526 0.038790 0.9033262 +0.9202 57 0.38116533 1029 0.0300686 0.029866 0.9333948 +1.0613 58 0.28258809 1029 0.0222923 0.022088 0.9556871 +1.2024 59 0.20116440 1029 0.0158691 0.015692 0.9715562 +1.3440 60 0.13746234 1029 0.0108439 0.010708 0.9824001 +1.4871 61 0.09013924 1029 0.0071107 0.007020 0.9895108 +1.6325 62 0.05670049 1029 0.0044729 0.004420 0.9939837 +1.7815 63 0.03420030 1029 0.0026979 0.002673 0.9966816 64 0.01977205 1029 0.0015597 0.001553 * 0.9982413 65 0.01095067 1029 0.0008639 0.000867 0.9991052 66 0.00580717 1029 0.0004581 0.000465 0.9995633 67 0.00294692 1029 0.0002325 0.000239 0.9997958 68 0.00143012 1029 0.0001128 0.000118 0.9999086 69 0.00066325 1029 0.0000523 0.000056 0.9999609 70 0.00029372 1029 0.0000232 0.000026 0.9999841 71 0.00012411 1029 0.0000098 0.000011 0.9999939 72 0.00004999 1029 0.0000039 0.000005 0.9999978 73 0.00001917 1029 0.0000015 0.000002 0.9999993 74 0.00000700 1029 0.0000006 0.000001 0.9999999 75 0.00000243 1029 0.0000002 0.000000 1.0000001 76 0.00000080 1029 0.0000001

1.0000002 77 0.00000025 1029 0.0000000 1.0000002 78 0.00000007 1029 79 0.00000002 1029 80 0.00000001 1029 C 12.67650601 1029 = 2100 Note the symmetry in the frequency distribution.

Ẑ = erf⁻¹(2F – 1) = –7.0907 + 0.14181 x,   r = 0.9999959
√2 σ = 7.0515, μ = 50.000
f̂ = 0.080010 exp[–(x – 50.00)^2/(7.0515)^2]
f̂ = –0.000181 + 1.00521 f,   r = 0.999989

Case 10: 12 x 12
For this case n = 12 x 12 = 144 and x assumes values of 0, 1, 2, 3, …, 144. Corresponding values of c_{n,x} are calculated from Eq. (1). Results are given in Table A24.
Table A24. Frequency distribution for the square peg board with a 12 x 12 array.

x c f F Z f 0.000000 0 1 0.000000 0.000000 1 144 0.000000 0.000000 0.000000 2 3 4 5 6 35 0.00000000 1042 0.000000 36 0.00000001 1042 0.000000 37 0.00000003 1042 0.000000 38 0.00000009 1042 0.00000000 39 0.00000025 1042 0.00000001 0.00000001 40 0.00000066 1042 0.00000003 0.00000004 41 0.00000168 1042 0.00000008 0.00000012 42 0.00000411 1042 0.00000018 0.000000 0.00000030 43 0.00000975 1042 0.00000044 0.000001 0.00000074 44 0.00002237 1042 0.00000100 0.000001 0.00000174 45 0.00004972 1042 0.00000223 0.000003

0.00000397 46 0.00010700 1042 0.00000480 0.000005 0.00000877 47 0.00022310 1042 0.00001000 0.000011 0.00001877 48 0.00045085 1042 0.00002022 0.000022 0.00003899 49 0.00088331 1042 0.00003961 0.000042 0.00007860 50 0.00167828 1042 0.00007526 0.000079 0.00015386 51 0.00309331 1042 0.00013871 0.000143 0.00029257 52 0.00553226 1042 0.00024808 0.000254 0.00054065 53 0.00960317 1042 0.00043062 0.000436 0.00097127 54 0.01618312 1042 0.00072568 0.000731 0.00169695 55 0.02648146 1042 0.00118747 0.001190 * 0.00288442 56 0.04208661 1042 0.00188723 0.001884 0.00477165 57 0.06497582 1042 0.00291362 0.002901 0.00768527 –1.7166 58 0.09746373 1042 0.00437042 0.004344 0.01205569 –1.5942 59 0.14206578 1042 0.00637045 0.006327 0.01842614 –1.4738 60 0.20125985 1042 0.00902480 0.008962 0.02745094 –1.3550 61 0.27714472 1042 0.01242760 0.012346 0.03987854 –1.2371 62 0.37101632 1042 0.01663695 0.016540 0.05651549 –1.1197 63 0.48291012 1042 0.02165444 0.021550 0.07816993 –1.0023 64 0.61118313 1042 0.02740640 0.027306 0.10557633 –0.8848 65 0.75222539 1042 0.03373095 0.033650 0.13930728 –0.7668 66 0.90039099 1042 0.04037493 0.040329 0.17968221 –0.6486 67 1.04821638 1042 0.04700365 0.047005 0.22668586 –0.5302 68 1.18695090 1042 0.05322472 0.053282

0.27991058 –0.4119 69 1.30736621 1042 0.05862433 0.058738 0.33853491 –0.2941 70 1.40074951 1042 0.06281178 0.062973 0.40134669 –0.1767 71 1.45993611 1042 0.06546580 0.065660 0.46681249 –0.0593 72 1.48021300 1042 0.06637505 0.066581 0.53318754 +0.0593 73 1.45993611 1042 0.06546580 0.065660 0.59865334 +0.1767 74 1.40074951 1042 0.06281178 0.062973 0.66146512 +0.2941 75 1.30736621 1042 0.05862433 0.058738 0.72008945 +0.4119 76 1.18695090 1042 0.05322472 0.053282 0.77331417 +0.5302 77 1.04821638 1042 0.04700365 0.047005 0.82031782 +0.6486 78 0.90039099 1042 0.04037493 0.040329 0.86069275 +0.7668 79 0.75222539 1042 0.03373095 0.033650 0.89442370 +0.8848 80 0.61118313 1042 0.02740640 0.027306 0.92183010 +1.0023 81 0.48291012 1042 0.02165444 0.021550 0.94348454 +1.1197 82 0.37101632 1042 0.01663695 0.016540 0.96012149 +1.2371 83 0.27714472 1042 0.01242760 0.012346 0.97254909 +1.3550 84 0.20125985 1042 0.00902480 0.008962 0.98157389 +1.4738 85 0.14206578 1042 0.00637045 0.006327 0.98794434 +1.5942 86 0.09746373 1042 0.00437042 0.004344 0.99231476 +1.7166 87 0.06497582 1042 0.00291362 0.002901 0.99522838 88 0.04208661 1042 0.00188723 0.001884 0.99711561 89 0.02648146 1042 0.00118747 0.001190 * 0.99830308 90 0.01618312 1042 0.00072568 0.000731 0.99902876 91 0.00960317 1042 0.00043062 0.000436

0.99945938 92 0.00553226 1042 0.00024808 0.000254 0.99970746 93 0.00309331 1042 0.00013871 0.000143 0.99984617 94 0.00167828 1042 0.00007526 0.000079 0.99992143 95 0.00088331 1042 0.00003961 0.000042 0.99996104 96 0.00045085 1042 0.00002022 0.000022 0.99998126 97 0.00022310 1042 0.00001000 0.000011 0.99999126 98 0.00010700 1042 0.00000480 0.000005 0.99999606 99 0.00004972 1042 0.00000223 0.000003 0.99999829 100 0.00002237 1042 0.00000100 0.000001 0.99999929 101 0.00000975 1042 0.00000044 0.000001 0.99999973 102 0.00000411 1042 0.00000018 0.000000 0.99999991 103 0.00000168 1042 0.00000008 0.99999999 104 0.00000066 1042 0.00000003 1.00000002 105 0.00000025 1042 0.00000001 1.00000003 106 0.00000009 1042 0.00000000 107 0.00000003 1042 0.000000 108 0.00000001 1042 0.000000 109 0.00000000 1042 0.000000 143 144 144 1 C 22.30074520 1042 = 2144 ok Note the symmetry in the frequency distribution.

Ẑ = erf⁻¹(2F – 1) = –8.49688 + 0.11801 x,   r = 0.9999985
√2 σ = 8.47377, μ = 72.00
f̂ = 0.066581 exp[–(x – 72.00)^2/(8.47377)^2]
f̂ = –0.0000840 + 1.00290 f,   r = 0.9999960

Note that the total number of combinations is controlled by 2^n. For n = 500 cells, this gives C = 3.2733906 × 10^150 total combinations!!! Is this a large number? Compared to what? Compared to molecules in a container of gas it may not be so big!! Remember Avogadro’s number from chemistry (6.0 × 10^23)? Or the number of microorganisms in a living body. Statistics entered the field of physics when James Clerk Maxwell used it in his kinetic theory of gases in the mid 1800s, and was then developed further by Ludwig Boltzmann and Willard Gibbs. This led to the branch of physics known as statistical mechanics. These concepts were later incorporated in chemical kinetics by Henry Eyring in the absolute reaction rate theory.

Table A25. Spread of the distributions for peg boards with different number of holes (n).

Board        n     μ/(√2 σ)   n/2      √2 σ       √(n/2)/(√2 σ)
Linear       2     0.9534     1.00     1.0489     0.9534
             5     1.6286     2.50     1.5351     1.0300
             6     1.8003     3.00     1.6664     1.0394
             7     1.8892     3.50     1.8526     1.0098
             8     2.0312     4.00     1.9692     1.0156
Triangular   3     1.2213     1.50     1.2282     0.9972
             6     1.8003     3.00     1.6664     1.0394
             10    2.2939     5.00     2.1797     1.0259
             15    2.7711     7.50     2.7065     1.0119
             21    3.2707     10.50    3.2103     1.0094
             28    3.7810     14.00    3.7027     1.0105
Square       4     1.4391     2.00     1.3898     1.0176
             9     2.1653     4.50     2.0782     1.0207
             16    2.8703     8.00     2.7872     1.0148
             25    3.5740     12.50    3.4975     1.0109
             36    4.2786     18.00    4.2070     1.0085
             49    4.9687     24.50    4.93087    1.0038
             64    5.67576    32.00    5.63801    1.0033
             81    6.38305    40.50    6.34493    1.0030
             100   7.0907     50.00    7.05149    1.0028
             144   8.49672    72.00    8.473858   1.00135

Note: √2 σ ≅ √(n/2) (equivalently σ ≅ √n/2), so the spread of the distribution approaches √(n/2); the center of the distribution is equal to μ = n/2.
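This note can be checked directly, since the discrete peg board distribution has mean n/2 and variance n/4 exactly; a short sketch (illustrative, not part of the memoir) follows.

```python
from math import comb, sqrt

for n in (4, 16, 64, 144):
    f = [comb(n, x) / 2 ** n for x in range(n + 1)]
    mean = sum(x * fx for x, fx in enumerate(f))
    var = sum((x - mean) ** 2 * fx for x, fx in enumerate(f))
    # mean = n/2 and sqrt(2)*sigma = sqrt(n/2) exactly, for every board size
    print(n, mean, round(sqrt(2 * var), 4), round(sqrt(n / 2), 4))
```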

References

Abramowitz, M. and I.A. Stegun. 1965. Handbook of Mathematical Functions. Dover Publications. New York, NY.
Eigen, M. and R. Winkler. 1993. Laws of the Game: How the Principles of Nature Govern Chance. Princeton University Press. Princeton, NJ.
Polster, B. 2004. Q.E.D.: Beauty in Mathematical Proof. Walker & Co. New York, NY.
Ruhla, C. 1992. The Physics of Chance: From Blaise Pascal to Niels Bohr. Oxford University Press. New York, NY.
Watkins, M. 2000. Useful Mathematical and Physical Formulae. Walker & Co. New York, NY.

Table A26. Correlation between discrete (f) and continuous Gaussian (f̂) distributions.

Pegboard     Cells      Regression equation              Correlation (r)
Linear       1 x 2      f̂ = –0.1045 + 1.2848 f           1
             1 x 3      f̂ = –0.03885 + 1.1404 f          1.0000
             1 x 4      f̂ = –0.0226 + 1.1052 f           0.9971
             1 x 5      f̂ = –0.01543 + 1.0882 f          0.99723
             1 x 6      f̂ = –0.01187 + 1.0809 f          0.99730
             1 x 7      f̂ = –0.01188 + 1.0689 f          0.999198
             1 x 8      f̂ = –0.008700 + 1.0589 f         0.999084
Triangular   n = 3      f̂ = –0.03950 + 1.1432 f          1.000000
             n = 6      f̂ = –0.01187 + 1.0809 f          0.99730
             n = 10     f̂ = –0.00352 + 1.0383 f          0.99894
             n = 15     f̂ = –0.002046 + 1.02423 f        0.999637
             n = 21     f̂ = –0.00135 + 1.0183 f          0.99980
             n = 28     f̂ = –0.000995 + 1.01674 f        0.99985
Square       2 x 2      f̂ = –0.02262 + 1.1053 f          0.9971
             3 x 3      f̂ = –0.00373 + 1.0369 f          0.99895
             4 x 4      f̂ = –0.00194 + 1.0248 f          0.99962
             5 x 5      f̂ = –0.001129 + 1.017789 f       0.999827
             6 x 6      f̂ = –0.000723 + 1.01356 f        0.999909
             7 x 7      f̂ = –0.000345 + 1.00727 f        0.999965
             8 x 8      f̂ = –0.000252 + 1.00622 f        0.999978
             9 x 9      f̂ = –0.000232 + 1.00595 f        0.999985
             10 x 10    f̂ = –0.000181 + 1.00521 f        0.999989
             12 x 12    f̂ = –0.0000840 + 1.00290 f       0.9999960

As the number of cells increases, the intercept approaches 0 and the slope approaches 1, which means that the fit of the continuous Gaussian distribution to the discrete distribution improves as the number of cells increases.
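The trend in Table A26 can be sketched in a few lines (illustrative only; the memoir estimates μ and σ from the Z-regression of Eq. (5), whereas this check simply uses the exact values μ = n/2 and σ = √n/2): the intercept of f̂ vs. f shrinks toward 0 and the slope moves toward 1 as n increases.

```python
import numpy as np
from math import comb, pi, sqrt

for n in (2, 9, 36, 144):
    x = np.arange(n + 1)
    f = np.array([comb(n, k) for k in x], dtype=float) / 2.0 ** n
    sigma = sqrt(n) / 2.0                          # exact binomial spread
    f_hat = np.exp(-(x - n / 2.0) ** 2 / (2.0 * sigma ** 2)) / (sqrt(2.0 * pi) * sigma)
    slope, intercept = np.polyfit(f, f_hat, 1)     # regress Gaussian f_hat on discrete f
    print(n, round(intercept, 5), round(slope, 4)) # intercept -> 0, slope -> 1
```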


Gaussian Distribution

The Gauss differential equation is given by

dy/dx = –kxy   with y = A at x = 0     (1)

where y (y ≥ 0) is a continuous function of x (–∞ < x < +∞), and k is the distribution coefficient.

1. Obtain the integral solution to Eq. (1).
2. Sketch the form of the solution y vs. x on linear graph paper.
3. Perform the 2nd derivative of y on x to obtain the inflection points at x = ±σ. Write the constant k in terms of σ.
4. Evaluate the constant A by normalizing the integral

∫_{–∞}^{+∞} y dx = 1     (2)

5. Write the resulting solution y in terms of variable x and parameter σ.
6. Obtain the cumulative probability distribution F

F = ∫_{–∞}^{x} y dx = (1/2)[1 + erf(x/(√2 σ))]     (3)

where the ‘error function’ is defined by

erf z = (2/√π) ∫₀^{z} exp(–u^2) du     (4)

7. Calculate and plot F vs. x/(√2 σ) on linear graph paper.

Solutions

Note: at x = 0, dy/dx = 0 (maximum or minimum).

1. Separating variables, ∫ dy/y = –k ∫ x dx, which gives the integral solution

y = A exp(–k x^2/2)

The solution is symmetric around x = 0.

3. The second derivative is

d^2y/dx^2 = kA exp(–k x^2/2)(k x^2 – 1) = 0

so the inflection points occur at k x^2 = 1, i.e. x = ±1/√k. Identifying the inflection points with x = ±σ gives k = 1/σ^2, so that

y = A exp(–x^2/(2σ^2))

At x = 0, d^2y/dx^2 < 0 (maximum).

4. Normalizing the integral:

∫_{–∞}^{+∞} y dx = A ∫_{–∞}^{+∞} exp(–x^2/(2σ^2)) dx = √2 σ A ∫_{–∞}^{+∞} exp(–u^2) du = √(2π) σ A = 1   ⇒   A = 1/(√(2π) σ)

A is chosen so that the distribution is normalized, hence the term ‘normal distribution’.

5. The resulting solution is

y = (1/(√(2π) σ)) exp(–x^2/(2σ^2))

6. The cumulative probability distribution is

F = ∫_{–∞}^{x} y dx = (1/√π) ∫_{–∞}^{x/√2σ} exp(–u^2) du
  = (1/√π) ∫_{–∞}^{0} exp(–u^2) du + (1/√π) ∫_{0}^{x/√2σ} exp(–u^2) du
  = 1/2 + (1/√π) ∫_{0}^{x/√2σ} exp(–u^2) du
  = (1/2)[1 + erf(x/(√2 σ))]

where

erf(x/(√2 σ)) = (2/√π) ∫_{0}^{x/√2σ} exp(–u^2) du

This ties the cumulative frequency distribution to the error function of mathematical physics. Note the characteristics of the error function: erf (0) = 0, erf (∞) = 1, erf (–x) = –erf (+x). It follows that F is bounded by 0 ≤ F ≤ 1. Note also that F is a well-behaved function.
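A quick numerical confirmation of steps 4–6 (a sketch under an arbitrary choice of σ, not part of the memoir): integrate the normalized density and compare the cumulative integral with the erf expression of Eq. (3).

```python
from math import erf, exp, pi, sqrt
from scipy.integrate import quad

sigma = 1.7                                  # arbitrary illustrative spread
y = lambda x: exp(-x ** 2 / (2.0 * sigma ** 2)) / (sqrt(2.0 * pi) * sigma)

total, _ = quad(y, -10 * sigma, 10 * sigma)  # step 4: normalization -> 1
x0 = 1.3                                     # arbitrary test point
F_num, _ = quad(y, -10 * sigma, x0)          # step 6: cumulative distribution
F_erf = 0.5 * (1.0 + erf(x0 / (sqrt(2.0) * sigma)))
print(round(total, 6), round(F_num, 6), round(F_erf, 6))  # 1.0, and the two F values agree
```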

Solve Eq. (1) by the power series method.

\frac{dy}{dx} = -kxy \quad \text{with } y = A \text{ at } x = 0    (1)

Assume that the solution is given by a power series

y = a_{0} + a_{1}x + a_{2}x^{2} + a_{3}x^{3} + a_{4}x^{4} + a_{5}x^{5} + a_{6}x^{6} + a_{7}x^{7} + \cdots    (2)

The first derivative is given by

\frac{dy}{dx} = a_{1} + 2a_{2}x + 3a_{3}x^{2} + 4a_{4}x^{3} + 5a_{5}x^{4} + 6a_{6}x^{5} + 7a_{7}x^{6} + \cdots    (3)

Substitution of Eqs. (2) and (3) into Eq. (1) leads to

a_{1} + 2a_{2}x + 3a_{3}x^{2} + 4a_{4}x^{3} + 5a_{5}x^{4} + 6a_{6}x^{5} + 7a_{7}x^{6} + \cdots = -kx\left(a_{0} + a_{1}x + a_{2}x^{2} + a_{3}x^{3} + a_{4}x^{4} + a_{5}x^{5} + a_{6}x^{6} + \cdots\right) = -ka_{0}x - ka_{1}x^{2} - ka_{2}x^{3} - ka_{3}x^{4} - ka_{4}x^{5} - ka_{5}x^{6} - ka_{6}x^{7} - \cdots    (4)

Equating like coefficients in Eq. (4) gives the recursion relations

a_{1} = 0
2a_{2} = -ka_{0}  =>  a_{2} = -\frac{k}{2}a_{0}
3a_{3} = -ka_{1}  =>  a_{3} = 0
4a_{4} = -ka_{2}  =>  a_{4} = -\frac{k}{4}a_{2} = \frac{k^{2}}{2\cdot 4}a_{0}
5a_{5} = -ka_{3}  =>  a_{5} = 0
6a_{6} = -ka_{4}  =>  a_{6} = -\frac{k}{6}a_{4} = -\frac{k^{3}}{2\cdot 4\cdot 6}a_{0}

It follows that the solution is given by

y = a_{0} + a_{2}x^{2} + a_{4}x^{4} + a_{6}x^{6} + \cdots = a_{0}\left[1 - \frac{k}{2}x^{2} + \frac{k^{2}}{2\cdot 4}x^{4} - \frac{k^{3}}{2\cdot 4\cdot 6}x^{6} + \cdots\right]    (5)

Now use the substitution

u = \frac{kx^{2}}{2}    (6)

Then Eq. (5) becomes

y = a_{0}\left[1 - u + \frac{u^{2}}{2!} - \frac{u^{3}}{3!} + \cdots\right] = a_{0}\exp(-u) = a_{0}\exp\left(-\frac{kx^{2}}{2}\right)    (7)

The constant a_{0} is evaluated from the boundary condition, which leads to

y = A\exp\left(-\frac{kx^{2}}{2}\right)    (8)

This is the famous Gaussian distribution centered at x = 0. It remains to determine k in terms of the variance of the distribution and A to normalize the distribution. Check:

\frac{dy}{dx} = A(-kx)\exp\left(-\frac{kx^{2}}{2}\right) = -kxy \quad \text{correct}    (9)
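The recursion can also be checked numerically by generating the coefficients a_i and summing the truncated series. A small sketch in Python follows; the variable names and the chosen values of k, A, and x are illustrative assumptions.

    import math

    def series_solution(x, k=1.0, A=1.0, n_terms=40):
        # coefficients from the recursion (i + 2) a_{i+2} = -k a_i, with a_0 = A and a_1 = 0
        a = [0.0] * n_terms
        a[0] = A
        for i in range(n_terms - 2):
            a[i + 2] = -k * a[i] / (i + 2)
        return sum(c * x ** i for i, c in enumerate(a))

    for x in (0.0, 0.5, 1.0, 2.0):
        # compare with Eq. (8), here with A = 1 and k = 1
        print(x, round(series_solution(x), 8), round(math.exp(-0.5 * x ** 2), 8))

The truncated series and the closed form agree to the printed precision, confirming the recursion.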


Probability distributions with dice. Consider a single die with 6 faces numbered 1 through 6. Assume that each number is equally likely. The frequency distribution can now be calculated.

Table A27. Frequency distribution for a single die

x   S   c   f            f̂          F            Z           Ẑ
                                     0.00000000
1   1   1   0.16666667   0.093758   0.16666667   -0.684656   -0.669385
2   2   1   0.16666667   0.146760   0.33333334   -0.304151   -0.334693
3   3   1   0.16666667   0.183615   0.50000001    0.000000    0.000000
4   4   1   0.16666667   0.183615   0.66666668   +0.304151   +0.334693
5   5   1   0.16666667   0.146760   0.83333335   +0.684656   +0.669385
6   6   1   0.16666667   0.093758   1.00000002

C = 6

Ẑ = erf⁻¹(2F - 1) = -1.171424 + 0.334693 S    r = 0.998961

√2 σ = 2.9878,  μ = 3.5000

f̂ = [1/(√(2π) σ)] exp[-(S - μ)²/2σ²] = 0.18883 exp{-[(S - 3.5000)/2.9878]²}

There is no correlation between f̂ and f, since f = constant = 1/6.
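The same probit calculation is easy to script. A sketch in Python is given below; the use of statistics.NormalDist to invert the error function, and the regression of Z against the class boundaries S + 1/2 (which reproduces the intercept and slope quoted above), are assumptions made for illustration.

    import math
    from statistics import NormalDist

    faces = [1, 2, 3, 4, 5, 6]
    f = [1.0 / 6.0] * 6                                # equal frequencies for a fair die
    F = [sum(f[:i + 1]) for i in range(6)]             # cumulative distribution, as in Table A27

    # Z = erfinv(2F - 1); with the standard normal inverse CDF, erfinv(2F - 1) = inv_cdf(F) / sqrt(2)
    Z = [NormalDist().inv_cdf(p) / math.sqrt(2.0) for p in F[:-1]]   # F = 1 is dropped (Z unbounded)

    # least-squares line of Z against the class boundaries S + 1/2
    S = [s + 0.5 for s in faces[:-1]]
    n = len(S)
    sx, sy = sum(S), sum(Z)
    sxx = sum(s * s for s in S)
    sxy = sum(s * z for s, z in zip(S, Z))
    slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    intercept = (sy - slope * sx) / n
    print(round(slope, 6), round(intercept, 6), round(1.0 / slope, 4))   # ~0.3347, ~-1.1714, sqrt(2)*sigma ~ 2.99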


Table A28. Frequency distribution for two dice

x (combinations)               S    c   f            f̂          ĉ      F            Z         Ẑ
                                                                        0.00000000
1,1                            2    1   0.02777778   0.020242   0.73   0.02777778   -1.3513   -1.2999
1,2; 2,1                       3    2   0.05555556   0.042894   1.54   0.08333334   -0.9781   -1.0110
1,3; 2,2; 3,1                  4    3   0.08333333   0.076922   2.77   0.16666667   -0.6847   -0.7222
1,4; 2,3; 3,2; 4,1             5    4   0.11111111   0.116745   4.20   0.27777778   -0.4164   -0.4333
1,5; 2,4; 3,3; 4,2; 5,1        6    5   0.13888889   0.149951   5.40   0.41666667   -0.1490   -0.1444
1,6; 2,5; 3,4; 4,3; 5,2; 6,1   7    6   0.16666667   0.163000   5.87   0.58333334   +0.1490   +0.1444
2,6; 3,5; 4,4; 5,3; 6,2        8    5   0.13888889   0.149951   5.40   0.72222223   +0.4164   +0.4333
3,6; 4,5; 5,4; 6,3             9    4   0.11111111   0.116745   4.20   0.83333334   +0.6847   +0.7222
4,6; 5,5; 6,4                  10   3   0.08333333   0.076922   2.77   0.91666667   +0.9781   +1.0110
5,6; 6,5                       11   2   0.05555556   0.042894   1.54   0.97222223   +1.3513   +1.2999
6,6                            12   1   0.02777778   0.020242   0.73   1.00000001

C = 36 = 6^2

Ẑ = erf⁻¹(2F - 1) = -2.02195 + 0.28885 S    r = 0.999211

√2 σ = 3.4620,  μ = 7.0000

f̂ = [1/(√(2π) σ)] exp[-(S - μ)²/2σ²] = 0.1630 exp{-[(S - 7.0000)/3.4620]²}

f̂ = -0.0145 + 1.1360 f    r = 0.99319

ĉ = 36 f̂

The two dice problem is discussed by Speyer (1994, p. 62).
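The counts and frequencies in Table A28 can be reproduced by direct enumeration of the 36 ordered outcomes. A short sketch in Python follows; the fitted μ and √2 σ values are simply taken from the table above, and the variable names are illustrative.

    import math
    from itertools import product
    from collections import Counter

    counts = Counter(sum(roll) for roll in product(range(1, 7), repeat=2))
    total = 6 ** 2
    mu, sqrt2_sigma = 7.0000, 3.4620          # fitted values quoted with Table A28

    for s in sorted(counts):
        f = counts[s] / total
        # f_hat = [1/(sqrt(2 pi) sigma)] exp[-(S - mu)^2 / (2 sigma^2)], written with sqrt(2) sigma
        f_hat = math.exp(-((s - mu) / sqrt2_sigma) ** 2) / (math.sqrt(math.pi) * sqrt2_sigma)
        print(s, counts[s], round(f, 8), round(f_hat, 6))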


Table A29. Frequency distribution for three dice. Column c gives the number of ordered face combinations that produce each sum S (for example, S = 4 arises from 1,1,2; 1,2,1; 2,1,1).

S    c    f          f̂          F          Z         Ẑ
                                 0.000000
3    1    0.004630   0.005418   0.004630
4    3    0.013889   0.012058   0.018519   -1.4726   -1.4344
5    6    0.027778   0.023940   0.046297   -1.1878   -1.1953
6    10   0.046296   0.042397   0.092593   -0.9372   -0.9563
7    15   0.069444   0.066973   0.162037   -0.6979   -0.7172
8    21   0.097222   0.094367   0.259259   -0.4563   -0.4781
9    25   0.115741   0.118605   0.375000   -0.2251   -0.2391
10   27   0.125000   0.132967   0.500000    0.0000    0.0000
11   27   0.125000   0.132967   0.625000   +0.2251   +0.2391
12   25   0.115741   0.118605   0.740741   +0.4563   +0.4781
13   21   0.097222   0.094367   0.837963   +0.6979   +0.7172
14   15   0.069444   0.066973   0.907407   +0.9372   +0.9563
15   10   0.046296   0.042397   0.953703   +1.1878   +1.1953
16   6    0.027778   0.023940   0.981481   +1.4726   +1.4344
17   3    0.013889   0.012058   0.995370
18   1    0.004630   0.005418   1.000000

C = 216 = 6^3

Ẑ = erf⁻¹(2F - 1) = -2.5102 + 0.23906 S    r = 0.999719

√2 σ = 4.1830,  μ = 10.5001

f̂ = [1/(√(2π) σ)] exp[-(S - μ)²/2σ²] = 0.13488 exp{-[(S - 10.500)/4.1830]²}

f̂ = -0.00590 + 1.0752 f    r = 0.99800

For three dice the discrete distribution is closely approximated by the continuous Gaussian distribution. I conclude that the peg board is a much simpler illustration of frequency distributions than dice.
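For any number of dice the counts in these tables follow from brute-force enumeration of the equally likely ordered outcomes. A sketch in Python is given below; the function name and structure are illustrative.

    from itertools import product
    from collections import Counter

    def dice_sum_counts(n_dice, faces=6):
        # tally the sums of all faces**n_dice equally likely ordered outcomes
        return Counter(sum(roll) for roll in product(range(1, faces + 1), repeat=n_dice))

    counts = dice_sum_counts(3)
    total = 6 ** 3
    for s in sorted(counts):
        print(s, counts[s], round(counts[s] / total, 6))
    # for example, S = 10 occurs 27 times, f = 27/216 = 0.125, as in Table A29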


Table A30. Frequency distribution for four dice. Column c counts the ordered face combinations for each sum S (for example, S = 5 arises from 1,1,1,2; 1,1,2,1; 1,2,1,1; 2,1,1,1).

S    c     f
4    1     0.000772
5    4     0.003086
6    10    0.007716
7    20    0.015432
8    35    0.027006
9    56    0.043210
10   80    0.061728
11   104   0.080247
12   125   0.096451
13   140   0.108025
14   146   0.112654
15   140   0.108025
16   125   0.096451
17   104   0.080247
18   80    0.061728
19   56    0.043210
20   35    0.027006
21   20    0.015432
22   10    0.007716
23   4     0.003086
24   1     0.000772


Table A31. Summary of frequency distribution for four dice

S    c     f          f̂           F          Z           Ẑ
                                   0.000000
4    1     0.000772   0.001510    0.000772
5    4     0.003086   0.003455    0.003858
6    10    0.007716   0.007246    0.011574   -1.605467   -1.565423
7    20    0.015432   0.013929*   0.027006   -1.359960   -1.356700
8    35    0.027006   0.024540    0.054012   -1.135382   -1.147977
9    56    0.043210   0.039627    0.097222   -0.917975   -0.939254
10   80    0.061728   0.058651    0.158950   -0.706891   -0.730531
11   104   0.080247   0.079564    0.239197   -0.501122   -0.521808
12   125   0.096451   0.098928    0.335648   -0.299664   -0.313085
13   140   0.108025   0.112740    0.443673   -0.100551   -0.104362
14   146   0.112654   0.117760    0.556327   +0.100551   +0.104362
15   140   0.108025   0.112740    0.664352   +0.299664   +0.313085
16   125   0.096451   0.098928    0.760803   +0.501122   +0.521808
17   104   0.080247   0.079564    0.841050   +0.706891   +0.730531
18   80    0.061728   0.058651    0.902778   +0.917975   +0.939254
19   56    0.043210   0.039627    0.945988   +1.135382   +1.147977
20   35    0.027006   0.024540    0.972994   +1.359960   +1.356700
21   20    0.015432   0.013929*   0.988426   +1.605467   +1.565423
22   10    0.007716   0.007246    0.996142
23   4     0.003086   0.003455    0.999228
24   1     0.000772   0.001510    1.000000


C = 1296 = 6^4

Ẑ = erf⁻¹(2F - 1) = -2.9221 + 0.208723 S    r = 0.999770

√2 σ = 4.79104,  μ = 14.0000

f̂ = [1/(√(2π) σ)] exp[-(S - μ)²/2σ²] = 0.11776 exp{-[(S - 14.000)/4.79104]²}

f̂ = -0.00512 + 1.07548 f    r = 0.99883

The frequency distribution for a set of dice conforms more closely to the continuous Gaussian distribution as the number of dice increases. The complexity of computing the discrete distribution increases dramatically with the number of dice and becomes unwieldy beyond four dice; for five dice the total number of combinations is already 6^5 = 7776. The two dice problem illustrates how well the continuous Gaussian distribution approximates the discrete (triangular) distribution. Even though the approximation is not exact, it does bring in an analytic function which we have used in the model for plant growth. The peg board offers a simpler model of the frequency distribution than does a set of dice.

Reference: Speyer, E. 1994. Six Roads from Newton: Great Discoveries in Physics. John Wiley & Sons, New York, NY.
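The improving agreement with the number of dice can be quantified by comparing the exact frequencies with a Gaussian of matching moments. The sketch below uses the exact mean 3.5n and variance 35n/12 of the sum of n fair dice rather than the probit-fitted values quoted above, so the numbers differ slightly from those fits; the comparison metric (largest absolute gap) is an illustrative choice.

    import math
    from itertools import product
    from collections import Counter

    def max_gap(n_dice):
        # largest |f - f_hat| between the exact frequencies of the sum of n fair dice
        # and a Gaussian with the same mean (3.5 n) and variance (35 n / 12)
        counts = Counter(sum(roll) for roll in product(range(1, 7), repeat=n_dice))
        total = 6 ** n_dice
        mu = 3.5 * n_dice
        sigma = math.sqrt(35.0 * n_dice / 12.0)
        gap = 0.0
        for s, c in counts.items():
            f = c / total
            f_hat = math.exp(-(s - mu) ** 2 / (2.0 * sigma ** 2)) / (math.sqrt(2.0 * math.pi) * sigma)
            gap = max(gap, abs(f - f_hat))
        return gap

    for n in (1, 2, 3, 4):
        print(n, round(max_gap(n), 6))    # the discrepancy shrinks as dice are added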


Derivatives for power functions (as defined by Cauchy)

Function y = x^{0} = 1:
\Delta y = (x + \Delta x)^{0} - x^{0} = 1 - 1 = 0; \quad \frac{\Delta y}{\Delta x} = 0; \quad \frac{dy}{dx} = \lim_{\Delta x \to 0}\frac{\Delta y}{\Delta x} = 0

Function y = x^{1}:
\Delta y = (x + \Delta x) - x = \Delta x; \quad \frac{\Delta y}{\Delta x} = 1; \quad \frac{dy}{dx} = \lim_{\Delta x \to 0}\frac{\Delta y}{\Delta x} = 1

Function y = x^{2}:
\Delta y = (x + \Delta x)^{2} - x^{2} = 2x\Delta x + (\Delta x)^{2}; \quad \frac{\Delta y}{\Delta x} = 2x + \Delta x; \quad \frac{dy}{dx} = \lim_{\Delta x \to 0}\frac{\Delta y}{\Delta x} = 2x

Function y = x^{3}:
\Delta y = (x + \Delta x)^{3} - x^{3} = 3x^{2}\Delta x + 3x(\Delta x)^{2} + (\Delta x)^{3}; \quad \frac{\Delta y}{\Delta x} = 3x^{2} + 3x\Delta x + (\Delta x)^{2}; \quad \frac{dy}{dx} = \lim_{\Delta x \to 0}\frac{\Delta y}{\Delta x} = 3x^{2}

Function y = x^{4}:
\Delta y = (x + \Delta x)^{4} - x^{4} = 4x^{3}\Delta x + 6x^{2}(\Delta x)^{2} + 4x(\Delta x)^{3} + (\Delta x)^{4}; \quad \frac{\Delta y}{\Delta x} = 4x^{3} + 6x^{2}\Delta x + 4x(\Delta x)^{2} + (\Delta x)^{3}; \quad \frac{dy}{dx} = \lim_{\Delta x \to 0}\frac{\Delta y}{\Delta x} = 4x^{3}

Function y = x^{n}:
\frac{dy}{dx} = n x^{n-1}
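The limit definition can be examined numerically by letting Δx shrink. A minimal sketch in Python follows; the sample point x = 2 and the powers tested are arbitrary choices.

    def difference_quotient(n, x, dx):
        # Cauchy difference quotient [ (x + dx)^n - x^n ] / dx for y = x^n
        return ((x + dx) ** n - x ** n) / dx

    x = 2.0
    for n in range(5):
        for dx in (1e-1, 1e-3, 1e-6):
            print(n, dx, difference_quotient(n, x, dx))
        print("limit n * x**(n - 1):", n * x ** (n - 1))

As Δx decreases, each quotient approaches the tabulated derivative n x^(n-1).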


The reader may note that the binomial expansion with n = 0, 1, 2, ... leads to finite power series. Newton was the first to prove this. He then showed that for n either negative or a fraction the expansion leads to an infinite power series (Berlinski, 2000, p. 30). This led to Newton's first memoir in 1669 entitled On Analysis by Infinite Series, which preceded his development of calculus. It is common to write the general solution as an infinite power series

y = a_{0} + a_{1}x + a_{2}x^{2} + a_{3}x^{3} + a_{4}x^{4} + \cdots = \sum_{i=0}^{\infty} a_{i}x^{i}

where the a_i are the expansion coefficients. Now the coefficients can be evaluated using the derivatives from calculus and the boundary values at x = 0. This leads to the following relationships

y(x = 0) = y_{0} = a_{0}

\left(\frac{dy}{dx}\right)_{0} = a_{1}

\left(\frac{d^{2}y}{dx^{2}}\right)_{0} = 2\cdot 1\cdot a_{2} \quad\Rightarrow\quad a_{2} = \frac{1}{2!}\left(\frac{d^{2}y}{dx^{2}}\right)_{0}

\left(\frac{d^{3}y}{dx^{3}}\right)_{0} = 3\cdot 2\cdot 1\cdot a_{3} \quad\Rightarrow\quad a_{3} = \frac{1}{3!}\left(\frac{d^{3}y}{dx^{3}}\right)_{0}

\left(\frac{d^{4}y}{dx^{4}}\right)_{0} = 4\cdot 3\cdot 2\cdot 1\cdot a_{4} \quad\Rightarrow\quad a_{4} = \frac{1}{4!}\left(\frac{d^{4}y}{dx^{4}}\right)_{0}

\left(\frac{d^{n}y}{dx^{n}}\right)_{0} = n!\,a_{n} \quad\Rightarrow\quad a_{n} = \frac{1}{n!}\left(\frac{d^{n}y}{dx^{n}}\right)_{0}

This leads finally to the infinite series

y = y_{0} + \frac{1}{1!}\left(\frac{dy}{dx}\right)_{0}x + \frac{1}{2!}\left(\frac{d^{2}y}{dx^{2}}\right)_{0}x^{2} + \cdots + \frac{1}{n!}\left(\frac{d^{n}y}{dx^{n}}\right)_{0}x^{n} + \cdots

which may be recognized as the Taylor series, discovered in 1715 by Brook Taylor. This approach assumes that the derivatives exist. It also assumes that the series converges to a finite value for any value of x, or at least for a limited domain of x. Physics usually enters into the process by way of a differential equation with initial conditions at x = 0.

Berlinski, D. 2000. Newton's Gift: How Sir Isaac Newton Unlocked the System of the World. Simon & Schuster, New York, NY.
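The coefficient relations can be exercised numerically: given the derivatives of a function at x = 0, the truncated series should reproduce the function near the origin. A short sketch in Python follows; the test function exp(x), whose nth derivative at 0 equals 1, is an assumed example and not drawn from the memoir.

    import math

    def maclaurin(derivs_at_zero, x):
        # partial sum of y = sum_n (1/n!) (d^n y/dx^n)_0 x^n
        return sum(d / math.factorial(n) * x ** n for n, d in enumerate(derivs_at_zero))

    # test case: y = exp(x), for which every derivative at x = 0 equals 1
    derivs = [1.0] * 15
    for x in (0.5, 1.0, 2.0):
        print(x, round(maclaurin(derivs, x), 8), round(math.exp(x), 8))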

Simon & Schuster. New York, NY. Example: Consider the first order differential equation

kdx

dy with the initial condition 0at0 xyy

where k is a constant. Solution:

kdx

dy

x

0

00

2

2

xdx

yd

000

4

4

03

3

x

n

n

xxdx

yd

dx

yd

dx

yd

The solution becomes

kxy

xdx

yd

nx

dx

ydx

dx

dyyy n

xn

n

xx

0

0

2

02

2

00 !

1

!2

1

!1

1

which is the equation of a straight line.


Example: Consider the first order differential equation

\frac{dy}{dx} = ky \quad \text{with the initial condition } y = y_{0} \text{ at } x = 0

where k is a constant.

Solution:

\left(\frac{dy}{dx}\right)_{0} = ky_{0}, \qquad \left(\frac{d^{2}y}{dx^{2}}\right)_{0} = k\left(\frac{dy}{dx}\right)_{0} = k^{2}y_{0}, \qquad \left(\frac{d^{3}y}{dx^{3}}\right)_{0} = k^{3}y_{0}, \qquad \ldots, \qquad \left(\frac{d^{n}y}{dx^{n}}\right)_{0} = k^{n}y_{0}

The solution becomes

y = y_{0} + \frac{1}{1!}\left(\frac{dy}{dx}\right)_{0}x + \frac{1}{2!}\left(\frac{d^{2}y}{dx^{2}}\right)_{0}x^{2} + \cdots + \frac{1}{n!}\left(\frac{d^{n}y}{dx^{n}}\right)_{0}x^{n} + \cdots

= y_{0} + ky_{0}x + \frac{1}{2!}k^{2}y_{0}x^{2} + \cdots + \frac{1}{n!}k^{n}y_{0}x^{n} + \cdots

= y_{0}\left[1 + \frac{1}{1!}kx + \frac{1}{2!}(kx)^{2} + \cdots + \frac{1}{n!}(kx)^{n} + \cdots\right]

= y_{0}\exp(kx)

where

\exp(kx) = 1 + \frac{1}{1!}kx + \frac{1}{2!}(kx)^{2} + \cdots + \frac{1}{n!}(kx)^{n} + \cdots

as defined by Euler.
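Euler's series definition of the exponential is easy to verify against the library function. A minimal sketch in Python follows; the values of k, y_0, x, and the truncation order are arbitrary choices.

    import math

    def exp_series(z, n_terms=30):
        # partial sum of Euler's series: 1 + z/1! + z^2/2! + ... + z^n/n!
        return sum(z ** n / math.factorial(n) for n in range(n_terms))

    k, y0 = 0.7, 2.0
    for x in (0.0, 1.0, 3.0):
        print(x, round(y0 * exp_series(k * x), 8), round(y0 * math.exp(k * x), 8))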