**Section 11.5 Inferences Concerning the Regression Coefficients**

__Confidence Interval for β, the slope parameter__

A (1-α)100% confidence interval for the parameter β in the regression line μ_{Y|x} = α + βx is

$$ b - t_{\alpha/2}\,\frac{s}{\sqrt{S_{xx}}} < \beta < b + t_{\alpha/2}\,\frac{s}{\sqrt{S_{xx}}} $$

where t_{α/2} is a value of the *t*-distribution with *n* - 2 degrees of freedom, and s = √(SSE/(n - 2)) is the residual standard error.
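To make the interval concrete, here is a minimal Python sketch (NumPy/SciPy) that fits the least-squares line and computes the confidence interval for the slope; the small `x`/`y` data set is invented for illustration, not taken from the text:

```python
import numpy as np
from scipy import stats

# Hypothetical illustrative data (not from the text)
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
y = np.array([2.1, 3.9, 6.2, 7.8, 10.1, 12.2])

n = len(x)
Sxx = np.sum((x - x.mean()) ** 2)
Sxy = np.sum((x - x.mean()) * (y - y.mean()))
b = Sxy / Sxx                    # least-squares slope
a = y.mean() - b * x.mean()      # least-squares intercept

SSE = np.sum((y - (a + b * x)) ** 2)
s = np.sqrt(SSE / (n - 2))       # residual standard error

alpha = 0.05
t_crit = stats.t.ppf(1 - alpha / 2, df=n - 2)   # t value with n-2 df
half_width = t_crit * s / np.sqrt(Sxx)
print(f"95% CI for beta: ({b - half_width:.4f}, {b + half_width:.4f})")
```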

__Hypothesis testing on the slope parameter, β__

To test the null hypothesis H₀: β = β₀ against any suitable alternative, use the *t*-distribution with *n* - 2 degrees of freedom to define the critical region, and the test statistic

$$ t = \frac{b - \beta_0}{s/\sqrt{S_{xx}}} $$
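The most common case is β₀ = 0, i.e. testing whether *x* has any linear effect on *Y* at all. A minimal Python sketch of the two-sided test, again with invented data:

```python
import numpy as np
from scipy import stats

# Hypothetical illustrative data (not from the text)
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
y = np.array([2.1, 3.9, 6.2, 7.8, 10.1, 12.2])

n = len(x)
Sxx = np.sum((x - x.mean()) ** 2)
b = np.sum((x - x.mean()) * (y - y.mean())) / Sxx
a = y.mean() - b * x.mean()
s = np.sqrt(np.sum((y - (a + b * x)) ** 2) / (n - 2))

beta0 = 0.0                                       # H0: beta = 0
t_stat = (b - beta0) / (s / np.sqrt(Sxx))
p_value = 2 * stats.t.sf(abs(t_stat), df=n - 2)   # two-sided P-value
print(f"t = {t_stat:.3f}, p = {p_value:.4g}")
```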

**Statistical Inference on the Intercept**

__Confidence Interval for α, the intercept parameter__

A (1-α)100% confidence interval for the parameter α in the regression line μ_{Y|x} = α + βx is

$$ a - t_{\alpha/2}\,\frac{s}{\sqrt{n S_{xx}}}\sqrt{\sum_{i=1}^{n} x_i^2} < \alpha < a + t_{\alpha/2}\,\frac{s}{\sqrt{n S_{xx}}}\sqrt{\sum_{i=1}^{n} x_i^2} $$

or, equivalently,

$$ a \pm t_{\alpha/2}\, s\sqrt{\frac{1}{n} + \frac{\bar{x}^2}{S_{xx}}} $$

where t_{α/2} is a value of the *t*-distribution with *n* - 2 degrees of freedom.
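The interval for the intercept can be sketched the same way; this Python snippet uses the first of the two equivalent forms above, with the same invented data as before:

```python
import numpy as np
from scipy import stats

# Hypothetical illustrative data (not from the text)
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
y = np.array([2.1, 3.9, 6.2, 7.8, 10.1, 12.2])

n = len(x)
Sxx = np.sum((x - x.mean()) ** 2)
b = np.sum((x - x.mean()) * (y - y.mean())) / Sxx
a = y.mean() - b * x.mean()
s = np.sqrt(np.sum((y - (a + b * x)) ** 2) / (n - 2))

t_crit = stats.t.ppf(0.975, df=n - 2)
# half-width = t * s / sqrt(n * Sxx) * sqrt(sum of x_i^2)
half_width = t_crit * (s / np.sqrt(n * Sxx)) * np.sqrt(np.sum(x ** 2))
print(f"95% CI for alpha: ({a - half_width:.4f}, {a + half_width:.4f})")
```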

__Hypothesis testing on the intercept parameter, α__

To test the null hypothesis H₀: α = α₀ against any suitable alternative, use the *t*-distribution with *n* - 2 degrees of freedom to define the critical region, and the test statistic

$$ t = \frac{a - \alpha_0}{s\sqrt{\sum_{i=1}^{n} x_i^2 \,/\, (n S_{xx})}} $$

Example: with *n* = 33 observations, df = 33 - 2 = 31. From Table A.4, the *P*-value is less than 0.05, so the null hypothesis of a zero intercept is rejected at the 0.05 level of significance.
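The same test can be sketched in Python; here, with the small invented data set used throughout these examples (not the *n* = 33 case study data), the intercept is close to zero and the test fails to reject H₀: α = 0:

```python
import numpy as np
from scipy import stats

# Hypothetical illustrative data (not the n = 33 case study data)
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
y = np.array([2.1, 3.9, 6.2, 7.8, 10.1, 12.2])

n = len(x)
Sxx = np.sum((x - x.mean()) ** 2)
b = np.sum((x - x.mean()) * (y - y.mean())) / Sxx
a = y.mean() - b * x.mean()
s = np.sqrt(np.sum((y - (a + b * x)) ** 2) / (n - 2))

alpha0 = 0.0                                      # H0: alpha = 0
se_a = s * np.sqrt(np.sum(x ** 2) / (n * Sxx))    # standard error of a
t_stat = (a - alpha0) / se_a
p_value = 2 * stats.t.sf(abs(t_stat), df=n - 2)
print(f"t = {t_stat:.3f}, p = {p_value:.3f}")
```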

__A measure of quality of fit: Coefficient of Determination, R^{2}__

Coefficient of Determination, *R*^{2} = proportion of the total variability in the dependent variable *Y* explained by the fitted model.

*TSS* = *SSR* + *SSE*

$$ R^2 = \frac{SSR}{TSS} = 1 - \frac{SSE}{TSS} $$

*R*^{2} = 1.0 if the fit is perfect

*R*^{2} close to 0.0 indicates a poor fit
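The decomposition *TSS* = *SSR* + *SSE* and the resulting *R*^{2} can be verified numerically; a short Python sketch with the same invented data:

```python
import numpy as np

# Hypothetical illustrative data (not from the text)
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
y = np.array([2.1, 3.9, 6.2, 7.8, 10.1, 12.2])

Sxx = np.sum((x - x.mean()) ** 2)
Sxy = np.sum((x - x.mean()) * (y - y.mean()))
b = Sxy / Sxx
a = y.mean() - b * x.mean()

TSS = np.sum((y - y.mean()) ** 2)          # total sum of squares
SSE = np.sum((y - (a + b * x)) ** 2)       # error (residual) sum of squares
SSR = b * Sxy                              # regression sum of squares
R2 = SSR / TSS                             # = 1 - SSE/TSS
print(f"TSS = {TSS:.3f}, SSR = {SSR:.3f}, SSE = {SSE:.3f}, R^2 = {R2:.4f}")
```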

**Section 11.6 Prediction**

__Confidence Interval for μ_{Y|x_0}, the mean response__

A (1-α)100% confidence interval for the mean response μ_{Y|x_0} in the regression line is

$$ \hat{y}_0 - t_{\alpha/2}\, s\sqrt{\frac{1}{n} + \frac{(x_0-\bar{x})^2}{S_{xx}}} < \mu_{Y|x_0} < \hat{y}_0 + t_{\alpha/2}\, s\sqrt{\frac{1}{n} + \frac{(x_0-\bar{x})^2}{S_{xx}}} $$

where t_{α/2} is a value of the *t*-distribution with *n* - 2 degrees of freedom.

__Prediction Interval for y_0, the future value of a response__

A (1-α)100% prediction interval for a single response y_0 at x = x_0 is

$$ \hat{y}_0 - t_{\alpha/2}\, s\sqrt{1 + \frac{1}{n} + \frac{(x_0-\bar{x})^2}{S_{xx}}} < y_0 < \hat{y}_0 + t_{\alpha/2}\, s\sqrt{1 + \frac{1}{n} + \frac{(x_0-\bar{x})^2}{S_{xx}}} $$

where t_{α/2} is a value of the *t*-distribution with *n* - 2 degrees of freedom. The extra 1 under the square root accounts for the variability of a single future observation, so the prediction interval is always wider than the confidence interval for the mean response.
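Both intervals can be computed side by side at a chosen x₀; this Python sketch (with the invented data used in the earlier examples, and x₀ = 4 chosen arbitrarily) shows that the prediction interval is the wider of the two:

```python
import numpy as np
from scipy import stats

# Hypothetical illustrative data (not from the text)
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
y = np.array([2.1, 3.9, 6.2, 7.8, 10.1, 12.2])

n = len(x)
Sxx = np.sum((x - x.mean()) ** 2)
b = np.sum((x - x.mean()) * (y - y.mean())) / Sxx
a = y.mean() - b * x.mean()
s = np.sqrt(np.sum((y - (a + b * x)) ** 2) / (n - 2))

x0 = 4.0                        # arbitrary point at which to predict
y0_hat = a + b * x0
t_crit = stats.t.ppf(0.975, df=n - 2)

se_mean = s * np.sqrt(1 / n + (x0 - x.mean()) ** 2 / Sxx)      # for mean response
se_pred = s * np.sqrt(1 + 1 / n + (x0 - x.mean()) ** 2 / Sxx)  # for a single response

ci = (y0_hat - t_crit * se_mean, y0_hat + t_crit * se_mean)
pi = (y0_hat - t_crit * se_pred, y0_hat + t_crit * se_pred)
print(f"95% CI for mean response: ({ci[0]:.3f}, {ci[1]:.3f})")
print(f"95% prediction interval:  ({pi[0]:.3f}, {pi[1]:.3f})")
```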

**Section 11.11 Simple Linear Regression
Case Study**

A more complicated model may be more appropriate: the higher *R*^{2} value of the transformed model suggests that it fits the data better.

**Section 11.12
Correlation**

__Correlation coefficient__

The measure ρ of linear association between two variables *X* and *Y* is estimated by the **sample correlation coefficient** *r*, where

$$ r = \frac{S_{xy}}{\sqrt{S_{xx} S_{yy}}} $$

-1 ≤ *r* ≤ 1

*r* = 1.0 or -1.0: perfect linear relationship

*r* = 0.0: no linear relationship

Sample coefficient of determination

$$ r^2 = \frac{S_{xy}^2}{S_{xx} S_{yy}} = \frac{SSR}{TSS} $$

represents the proportion of the total variation *TSS* explained by the regression of *Y* on *x*, namely, *SSR*.
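A short Python sketch of both quantities, with the invented data from the earlier examples; note that *r*² here coincides with the *R*² of the fitted line:

```python
import numpy as np

# Hypothetical illustrative data (not from the text)
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
y = np.array([2.1, 3.9, 6.2, 7.8, 10.1, 12.2])

Sxx = np.sum((x - x.mean()) ** 2)
Syy = np.sum((y - y.mean()) ** 2)
Sxy = np.sum((x - x.mean()) * (y - y.mean()))

r = Sxy / np.sqrt(Sxx * Syy)    # sample correlation coefficient
r2 = r ** 2                     # sample coefficient of determination
print(f"r = {r:.4f}, r^2 = {r2:.4f}")
```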

__Hypothesis testing on the correlation coefficient, ρ__

To test the null hypothesis H₀: ρ = 0 against a suitable alternative, we can use the *t*-distribution with *n* - 2 degrees of freedom to define the critical region, and the test statistic

$$ t = \frac{r\sqrt{n-2}}{\sqrt{1-r^2}} $$

but this test applies only to ρ₀ = 0 and behaves poorly when *r* is close to -1 or 1.

However, a more general test of H₀: ρ = ρ₀ is based on the Fisher transformation

$$ \frac{1}{2}\ln\frac{1+r}{1-r}, $$

which has an approximate normal distribution with mean $\frac{1}{2}\ln\frac{1+\rho}{1-\rho}$ and variance $\frac{1}{n-3}$.

We can use the standard normal distribution to define the critical region, and the test statistic

$$ z = \frac{\sqrt{n-3}}{2}\ln\!\left[\frac{(1+r)(1-\rho_0)}{(1-r)(1+\rho_0)}\right] $$
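Both tests can be sketched in a few lines of Python; the data are the invented values from the earlier examples, and ρ₀ = 0.5 is an arbitrary choice for the Fisher z-test:

```python
import numpy as np
from scipy import stats

# Hypothetical illustrative data (not from the text)
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
y = np.array([2.1, 3.9, 6.2, 7.8, 10.1, 12.2])

n = len(x)
Sxx = np.sum((x - x.mean()) ** 2)
Syy = np.sum((y - y.mean()) ** 2)
Sxy = np.sum((x - x.mean()) * (y - y.mean()))
r = Sxy / np.sqrt(Sxx * Syy)

# t-test of H0: rho = 0, with n-2 degrees of freedom
t_stat = r * np.sqrt(n - 2) / np.sqrt(1 - r ** 2)

# Fisher z-test of H0: rho = rho0 (rho0 = 0.5 chosen arbitrarily)
rho0 = 0.5
z = (np.sqrt(n - 3) / 2) * np.log(((1 + r) * (1 - rho0)) / ((1 - r) * (1 + rho0)))
p_value = 2 * stats.norm.sf(abs(z))   # two-sided, standard normal
print(f"t = {t_stat:.3f}, z = {z:.3f}, p = {p_value:.4g}")
```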

Important: correlation is a measure of __linear__ relationship, so *r* = 0 does not necessarily mean there is no relationship between two variables; *X* and *Y* may still be related nonlinearly.