CHAPTER 13—Simple Linear Regression Analysis
13.1 The “best” line that can be fitted to the observed data. The slope and the intercept of the least squares line.
LO2
13.2 Because we do not know how y and x are related outside the experimental region.
LO1, LO2
13.3 a. b0 = 15.84, b1 = -0.1279
b. b1 = -179.6475 / 1404.355 = -0.1279, b0 = 10.2125 – (-0.1279*43.98) = 15.84
c. Both the point estimate of the mean and the point prediction are obtained by substituting the temperature of 40 into the equation: ŷ = 15.84 - 0.1279(40) = 10.724 MMcf.
LO1, LO2
13.4 a. b0 = 14.82 b1 = 5.707
The interpretation of b0 is the starting salary of someone with a GPA of 0.
The interpretation of b1 is for each increase in GPA of 1, salary goes up $5,707
No. The interpretation of b0 does not make practical sense since it indicates that someone with a GPA = 0 would have a starting salary of $14,816, when in fact they would not have graduated with a GPA = 0.
b. y = 14.82 + 5.707(3.25) = 33.36775
That is, $33,367.75
LO1, LO2
13.5 a. b0 = 11.4641 b1 = 24.6022
b0 – 0 copiers, 11.46 minutes of service.
b1 – each additional copier adds 24.6022 minutes of service on average.
No. The interpretation of b0 does not make practical sense since it indicates that 11.46 minutes of service would be required for a customer with no copiers.
b. y = 11.4641 + 24.6022(4) = 109.873, or 109.9 minutes
LO1, LO2
13.6 a. b0 = 7.814 b1 = 2.665
b0 – 0 price difference yields demand of 7.814.
b1 – each increase in 1 of price difference increases demand on average by 2.665.
Yes. The interpretation of b0 does make practical sense since it indicates that 781,409 bottles of detergent would be demanded when the price difference with other products is zero.
b. ŷ = 7.814 + 2.665(.10) = 8.0805
LO1, LO2
13.7 a.
xi    yi    xi²    xiyi
5 71 25 355
62 663 3844 41106
35 381 1225 13335
12 138 144 1656
83 861 6889 71463
14 145 196 2030
46 493 2116 22678
52 548 2704 28496
23 251 529 5773
100 1024 10000 102400
41 435 1681 17835
75 772 5625 57900
SSxy = Σxiyi - (Σxi)(Σyi)/n = 365,027 - (548)(5,782)/12 = 100,982.33
SSxx = Σxi² - (Σxi)²/n = 34,978 - (548)²/12 = 9,952.667
b1 = SSxy/SSxx = 100,982.33/9,952.667 = 10.1463
b0 = ȳ - b1·x̄ = (5,782/12) - 10.1463(548/12) = 18.4875
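The hand computation above can be cross-checked with a short script. This is a sketch for verification only; the summary sums (n, Σxi, Σyi, Σxi², Σxiyi) are taken directly from the table and formulas in this solution.

```python
# Least squares slope and intercept from the 13.7 summary sums.
n = 12
sum_x, sum_y = 548, 5782
sum_x2, sum_xy = 34978, 365027

ss_xy = sum_xy - sum_x * sum_y / n      # SSxy = 100,982.33
ss_xx = sum_x2 - sum_x ** 2 / n         # SSxx = 9,952.667
b1 = ss_xy / ss_xx                      # slope
b0 = sum_y / n - b1 * (sum_x / n)       # intercept = y-bar - b1 * x-bar

print(round(b1, 4), round(b0, 4))       # ≈ 10.1463 and 18.4875
```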
b. b1 is the estimated increase in mean labor cost (10.1463) for every 1 unit increase in the batch size.
b0 is the estimated mean labor cost (18.4875) when batch size = 0; no.
c.
d. ŷ = 18.4880 + 10.1463(60) = 627.266
LO1, LO2
13.8 a. MINITAB output
Regression Analysis: Sale Price versus Size
The regression equation is
Sale Price = 48.02 + 5.70 Size
b1 = SSxy / SSxx = 1149.18 / 201.6 = 5.70.
b0 = y-bar - b1*x-bar = 155.19 - 5.70*18.8 = 48.03, which equals 48.02 within rounding.
b. b1 is the estimated increase in mean sales price (5.700) for every hundred-square-foot increase in home size.
b0 is the estimated mean sales price when square footage = 0. No, the interpretation of b0 makes no practical sense.
c. ŷ = 48.02 + 5.700x.
d. ŷ = 48.02 + 5.700(20) = 162.02.
That is, $162,020.
LO1, LO2
13.9 (1) Mean of error terms = 0; (2) Constant variance; (3) Normality; (4) Independence.
LO3
13.10 They estimate σ² and σ, the constant variance and standard deviation of the error term populations.
LO3
13.11 s2 = SSE/(n - 2) = 2.568/(8 - 2) = 0.428, s = √0.428 = 0.654
LO3
13.12 s2 = SSE/(n - 2) = 1.4387/(7 - 2) = 0.2876, s = √0.2876 = 0.5363
LO3
13.13 s2 = SSE/(n - 2) = 191.7017/(11 - 2) = 21.3002, s = √21.3002 = 4.61521
LO3
13.14 s2 = 2.8059/28 = 0.1002, s = √s2 = 0.3166
LO3
13.15 s2 = 746.7624/10 = 74.67624, s = 8.64154
LO3
13.16 s2 = SSE/(n - 2) = 896.8/(10 - 2) = 112.1, s = √112.1 = 10.58773
LO3
13.17 s2 = SSE/(n - 2) = 222.8242/(10 - 2) = 27.8530, s = √27.8530 = 5.2776
LO3
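The standard error of the estimate used throughout this section follows one formula, s² = SSE/(n - 2). A minimal sketch, using SSE = 2.568 with n = 8 (the fuel-consumption values that also appear in 13.20):

```python
from math import sqrt

def std_error(sse, n):
    """Mean square error and standard error: s2 = SSE/(n - 2), s = sqrt(s2)."""
    s2 = sse / (n - 2)      # divide SSE by its degrees of freedom
    return s2, sqrt(s2)

s2, s = std_error(2.568, 8)
print(round(s2, 3), round(s, 3))   # ≈ 0.428 and 0.654
```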
13.18 Strong (α = .05) or very strong (α = .01) evidence that the regression relationship is significant.
LO4
13.19 Explanations will vary.
LO4
13.20 a. b0 = 15.8379, b1 = -0.1279
b. SSE = 2.5679 s2 = 0.4280 s = 0.6542
c. sb1 = 0.0175, t = -7.3277
t = b1 / sb1 = -0.1279 / 0.0175 = -7.3277
d. df = 6, t.025 = 2.447. Reject H0; strong evidence of a significant relationship between x and y.
e. t.005 = 3.707. Reject H0; very strong evidence of a significant relationship between x and y.
f. p-value = .0003. Reject H0 at all values of α; extremely strong evidence of a significant relationship between x and y.
g. 95% CI: [b1 ± t.025 sb1] = -0.1279 ± (2.447)(0.0175) = [-0.1706, -0.0852]
We are 95% confident that the average fuel consumption decreases by between 0.0852 MMcf and 0.1706 MMcf for each 1 degree increase in monthly temperature.
h. 99% CI: [b1 ± t.005 sb1] = -0.1279 ± (3.707)(0.0175) = [-0.1928, -0.0630]
We are 99% confident that the average fuel consumption decreases by between 0.0630 MMcf and 0.1928 MMcf for each 1 degree increase in monthly temperature.
13-7
Chapter 13 - Simple Linear Regression Analysis
i. sb0 = 0.8018, t = 19.7535
t = b0 / sb0 = 15.8379 / 0.8018 = 19.7535
j. p-value < .001. Reject H0 at all values of α; extremely strong evidence that the y-intercept is significant.
k.
LO2, LO3, LO4
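The slope t statistic and confidence intervals in 13.20 can be sketched as below. The critical values t.025 = 2.447 and t.005 = 3.707 (df = 6) are taken from the solution rather than computed; because b1 and sb1 are rounded here, t comes out near -7.31 rather than the -7.3277 obtained from unrounded values.

```python
# t statistic and CIs for the slope, using the rounded 13.20 values.
b1, s_b1 = -0.1279, 0.0175
t025, t005 = 2.447, 3.707          # df = 6, from the t table

t = b1 / s_b1                      # test statistic for H0: beta1 = 0
ci_95 = (b1 - t025 * s_b1, b1 + t025 * s_b1)
ci_99 = (b1 - t005 * s_b1, b1 + t005 * s_b1)

print(round(t, 4))                         # ≈ -7.3086 (rounded inputs)
print([round(v, 4) for v in ci_95])        # ≈ [-0.1707, -0.0851]
```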
13.21 a. b0 = 14.816, b1 = 5.7066
b. SSE = 1.438 s2 = .288 s = .5363
c. sb1 = .3953, t = 14.44
t = b1 / sb1 = 5.7066 / .3953 = 14.44
d. df = 5, t.025 = 2.571. Reject H0; strong evidence of a significant relationship between x and y.
e. t.005 = 4.032. Reject H0; very strong evidence of a significant relationship between x and y.
f. p-value < .001. Reject H0 at all values of α; extremely strong evidence of a significant relationship between x and y.
g. 95% CI: [b1 ± t.025 sb1] = 5.7066 ± (2.571)(.3953) = [4.690, 6.723]
We are 95% confident that the mean starting salary increases by between $4690 and $6723 for each 1.0 increase in GPA.
h. 99% CI: [b1 ± t.005 sb1] = 5.7066 ± (4.032)(.3953) = [4.113, 7.300]
We are 99% confident that the mean starting salary increases by between $4113 and $7300 for each 1.0 increase in GPA.
i. sb0 = 1.235, t = 12.00
t = b0 / sb0 = 14.816 / 1.235 = 12.00
(Standard errors used in 13.20, parts c and i:)
sb1 = s/√SSxx = 0.6542/√1404.355 = 0.01746
sb0 = s√(1/n + x̄²/SSxx) = 0.6542√(1/8 + (43.98)²/1404.355) = 0.8018
j. p-value < .001. Reject H0 at all values of α; extremely strong evidence that the y-intercept is significant.
k.
LO2, LO3, LO4
13.22 a. b0 = 11.4641, b1 = 24.6022
b. SSE = 191.7017 s2 = 21.3002 s = 4.615
c. sb1 = .8045, t = 30.580
t = b1 / sb1 = 24.602 / .8045 = 30.580
d. df = 9, t.025 = 2.262. Reject H0; strong evidence of a significant relationship between x and y.
e. t.005 = 3.250. Reject H0; very strong evidence of a significant relationship between x and y.
f. p-value < .001. Reject H0 at all values of α; extremely strong evidence of a significant relationship between x and y.
g. [24.6022 ± 2.262(.8045)] = [22.782, 26.422]
h. [24.6022 ± 3.250(.8045)] = [21.987, 27.217]
i. sb0 = 3.4390, t = 3.334
t = b0 / sb0 = 11.464 / 3.439 = 3.334
j. p-value = .0087. Reject H0 at all values of α except .001.
k.
LO2, LO3, LO4
(Standard errors used in 13.21, parts c and i:)
sb1 = s/√SSxx = .5363/√1.8407 = .3953
sb0 = s√(1/n + x̄²/SSxx) = .5363√(1/7 + (3.0814)²/1.8407) = 1.235
(Standard errors used in 13.22, parts c and i:)
sb1 = s/√SSxx = 4.61521/√32.909 = .8045
sb0 = s√(1/n + x̄²/SSxx) = 4.61521√(1/11 + (3.909)²/32.909) = 3.439
13.23 See the solutions to 13.20 for guidance.
a. b0 = 7.814, b1 = 2.665 (from 13.6)
b. SSE = 2.806, s2 = .100, s = .3166
c. sb1 = .2585, t = 10.31
d. Reject H0.
e. Reject H0.
f. p-value < .001; reject H0 at each value of α.
g. [2.665 ± 2.048(.2585)] = [2.136, 3.194]
h. [2.665 ± 2.763(.2585)] = [1.951, 3.379]
i. sb0 = .0799, t = 97.82
j. p-value < .001; reject H0.
k.
LO2, LO3, LO4
13.24 See the solutions to 13.20 for guidance.
a. b0 = 18.4875, b1 = 10.1463 (from 13.7)
b. SSE = 746.7624, s2 = 74.67624, s = 8.642
c. sb1 = .0866, t = 117.1344
d. Reject H0.
e. Reject H0.
f. p-value < .001; reject H0 at each value of α.
g. [10.1463 ± 2.228(.0866)] = [9.953, 10.339]
h. [10.1463 ± 3.169(.0866)] = [9.872, 10.421]
i. sb0 = 4.6766, t = b0 / sb0 = 18.4875 / 4.6766 = 3.95
(Standard errors used in 13.23, parts c and i:)
sb1 = s/√SSxx = .31656/√1.49967 = .2585
sb0 = s√(1/n + x̄²/SSxx) = .31656√(1/30 + (.2133)²/1.49967) = .079883
j. p-value = .003; fail to reject H0 at α = .001; reject H0 at all other values of α.
k.
LO2, LO3, LO4
13.25 See the solutions to 13.20 for guidance.
a. b0 = 48.02, b1 = 5.7003
b. SSE = 896.8 s2 = 112.1 s = 10.588
c. sb1 = .7457, t = 7.64
t = b1 / sb1 = 5.7003 / .7457 = 7.64
d. df = 8, t.025 = 2.306. Reject H0.
e. t.005 = 3.355. Reject H0.
f. p-value < .001. Reject H0 at all values of α.
g. [3.9807, 7.4199]
h. [3.198, 8.202]
i. sb0 = 14.41, t = 3.33
t = b0 / sb0 = 48.02 / 14.41 = 3.33
j. p-value = .010. Reject H0 at all values of α except .01 and .001.
k.
LO2, LO3, LO4
(Standard errors used in 13.24, parts c and i:)
sb1 = s/√SSxx = 8.64154/√9952.667 = .086621
sb0 = s√(1/n + x̄²/SSxx) = 8.64154√(1/12 + (45.667)²/9952.667) = 4.67658
(Standard errors used in 13.25, parts c and i:)
sb1 = s/√SSxx = 10.588/√201.6 = .7457
sb0 = s√(1/n + x̄²/SSxx) = 10.588√(1/10 + (18.8)²/201.6) = 14.41
13.26 Find sb1 from the MINITAB output.
The regression equation is
sales = 66.2 + 4.43 ad exp

Predictor   Coef     SE Coef   T       P
Constant    66.212   5.767     11.48   0.000
Ad exp      4.4303   0.5810    7.62    0.000

95% C.I. for β1: [4.4303 ± 2.306(.5810)] = [3.091, 5.770]
LO4
13.27 a. b1 = 1.2731. If MeanTaste increases by 1, MeanPreference is predicted to increase by 1.2731.
b. (0.9885, 1.5577). We are 95% confident that this interval contains the true slope.
LO2, LO4
13.28 A confidence interval is for the mean value of y. A prediction interval is for an individual value of y.
LO5
13.29 The distance between x0 and x̄, the average of the previously observed values of x.
LO5
13.30 a. 10.721, [10.130, 11.312]
b. 10.721, [9.015, 12.427]
c. distance value = 1/8 + (40 - 43.98)² / 1404.355 = 0.1363; from the output, (0.241 / 0.6542)² = 0.1357
d. CI: 15.84 -0.1279 * 40 ± 2.447*0.6542*sqrt(0.1363) = [10.13, 11.31]
PI: 15.84 -0.1279 * 40 ± 2.447*0.6542*sqrt(1.1363) = [9.01, 12.43]
e. Since we are predicting fuel consumption for one day when the average temperature is 40 degrees we must use the prediction interval. Since 9.01 < 9.595 and 12.43 > 11.847 the city cannot be 95% confident it will not pay a fine. For the city to be at least 95% confident the PI would have to be inside the interval [9.595, 11.847].
LO5
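The 13.30 interval computations can be sketched as below; the only difference between the CI and the PI is the extra 1 under the square root. The fitted values, s, t.025, and the distance value all come from the solution above.

```python
from math import sqrt

# CI vs. PI at x0 = 40, using the 13.30 quantities.
y_hat = 15.84 - 0.1279 * 40                  # point estimate = 10.724
s, t025 = 0.6542, 2.447                      # df = 6
dist = 1/8 + (40 - 43.98) ** 2 / 1404.355    # distance value ≈ 0.1363

half_ci = t025 * s * sqrt(dist)       # interval for the mean response at x0
half_pi = t025 * s * sqrt(1 + dist)   # interval for one new observation at x0

print([round(y_hat - half_ci, 2), round(y_hat + half_ci, 2)])  # ≈ [10.13, 11.31]
print([round(y_hat - half_pi, 2), round(y_hat + half_pi, 2)])  # ≈ [9.02, 12.43]
```

The PI is always wider than the CI at the same x0, which is why part e of 13.30 must use the prediction interval.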
13-12
Chapter 13 - Simple Linear Regression Analysis
13.31 a. 33.362, [32.813, 33.911]
b. 33.362, [31.878, 34.846]
c. Distance value = 1/7 + (3.25 - 3.0814)²/1.8407 = .1583
d. [33.362 ± 2.571(.5363)√.1583] = [32.813, 33.911]
[33.362 ± 2.571(.5363)√(1 + .1583)] = [31.878, 34.846]
LO5
13.32 a. 109.873, [106.721, 113.025]
b. 109.873, [98.967, 120.779]
c. We have x0 = 4, x̄ = 3.90, SSxx = 32.90, n = 11
distance value = 1/11 + (4 - 3.90)²/32.90 = 0.090657961
So the confidence interval is:
109.873 ± (2.262)(4.615)√0.090657961 = [106.729, 113.016], which compares (within rounding) to the computer-generated output.
For the prediction interval with the same quantities we get
109.873 ± (2.262)(4.615)√1.090657961 = [98.971, 120.775], which also compares within rounding.
d. 113 minutes
LO5
13.33 a. 8.0806; [7.948, 8.213]
b. 8.0806; [7.419, 8.743]
c. distance value = (.0648/.316561)² = .0419
d. s√dist = .065, s = .3166, dist = (.065/.3166)² = .04215
99% C.I.: [8.0806 ± 2.763(.065)] = [7.9016, 8.2596]
99% P.I.: [8.0806 ± 2.763(.3166)√1.04215] = [7.1877, 8.9735]
e. (1) 8.4804; [8.360, 8.600] (2) 8.4804; [7.821, 9.140]
(3) s√dist = .059, s = .3166, dist = (.059/.3166)² = .03473
99% C.I.: [8.4804 ± 2.763(.059)] = [8.3857, 8.5251]
99% P.I.: [8.4804 ± 2.763(.3166)√1.03473] = [7.5909, 9.3699]
LO5
13.34 a. 627.26, [621.05, 633.47]
b. 627.26, [607.03, 647.49]
c. s√dist = 2.7868, s = 8.642, dist = (2.7868/8.642)² = .1040
99% C.I.: [627.26 ± 3.169(2.79)] = [618.42, 636.10]
99% P.I.: [627.26 ± 3.169(8.642)√1.1042] = [598.48, 656.04]
LO5
13.35 a. 162.03, [154.04, 170.02]
b. 162.03, [136.34, 187.72]
c. Prediction interval, because it deals with individuals, not an average.
LO5
13.36 Total variation: measures the total amount of variation exhibited by the observed values of y. Unexplained variation: measures the amount of variation in the values of y that is not explained by the model (predictor variable). Explained variation: measures the amount of variation in the values of y that is explained by the predictor variable.
LO6
13.37 Proportion of the total variation in the n observed values of y that is explained by the simple linear regression model.
LO6
13.38 Explained variation = 25.549 – 2.568 = 22.981
r2 = 22.981 / 25.549 = 0.899
r = -√0.899 = -0.948 (r takes the sign of b1, which is negative)
89.9% of the variation in fuel consumption can be explained by variation in average temperature.
LO6
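The r² computations in 13.38-13.43 all follow the same pattern, sketched below with the 13.38 fuel-consumption values (total variation 25.549, SSE 2.568); r carries the sign of b1.

```python
from math import sqrt

# r-squared and r from total and unexplained variation (13.38 values).
total_var, sse = 25.549, 2.568
explained = total_var - sse        # explained variation = 22.981
r2 = explained / total_var         # proportion of variation explained

r = -sqrt(r2)                      # negative because b1 = -0.1279 < 0

print(round(r2, 3), round(r, 3))   # ≈ 0.899 and -0.948
```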
13.39 Explained variation = 61.38 – 1.438 = 59.942
r2 = 59.942 / 61.38 = 0.977
r = +sqrt(0.977) = 0.988
97.7% of the variation in starting salary can be explained by variation in GPA.
LO6
13.40 Explained variation = 20,110.5445 – 191.7017 = 19918.8428
r2 = 19918.8428 / 20110.5445 = 0.990
r = +sqrt(0.990) = 0.995
99% of the variation in service time can be explained by variation in number of copiers repaired.
LO6
13.41 Explained variation = 13.459 – 2.806 = 10.653
r2 = 10.653 / 13.459 = 0.792
r = +sqrt(0.792) = 0.890
79.2% of the variation in demand can be explained by variation in price differential.
LO6
13.42 Explained variation = 1,025,339.6667 – 746.7624 = 1,024,592.904
r2 = 1,024,592.904 / 1,025,339.6667 = 0.999
r = +sqrt(0.999) = 0.9995
99.9% of the variation in direct labor can be explained by variation in batch size.
LO6
13.43 Explained variation = 7447.5 – 896.8 = 6550.7
r2 = 6550.7 / 7447.5 = 0.88
r = +sqrt(0.88) = 0.938
88% of the variation in sales price can be explained by variation in square footage.
LO6
13.44 ρ is the actual (unknown) value of the correlation between the two variables.
LO7
13.45 Calculate t = r√(n - 2)/√(1 - r²) and obtain its associated p-value. If p-value < α, then reject H0: ρ = 0.
LO7
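The 13.45 test statistic is a one-liner. As an illustration only, the values r = 0.988 and n = 7 below are the salary results from 13.39; the resulting t is close to (but not exactly) the 14.44 obtained from unrounded data in 13.21.

```python
from math import sqrt

def corr_t(r, n):
    """t statistic for testing H0: rho = 0 against Ha: rho != 0."""
    return r * sqrt(n - 2) / sqrt(1 - r ** 2)

t = corr_t(0.988, 7)
print(round(t, 2))   # a large t gives a small p-value, so reject H0
```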
13-15
Chapter 13 - Simple Linear Regression Analysis
13.46 Reject H0 at all four values of α.
LO7
13.47 Reject H0 at all four values of α.
LO7
13.48 .
LO8
13.49 The t test on β1.
LO8
13.50 a. F = 22.9808 / (2.5679 / 6) = 53.6949
b. F.05 = 5.99 df1 = 1, df2 = 6
Since 53.6949 > 5.99, reject H0 with strong evidence of a significant relationship between x and y.
c. F.01 = 13.75 df1 = 1, df2 = 6
Since 53.6949 > 13.75, reject H0 with very strong evidence of a significant relationship between x and y.
d. p-value = 0.0003; reject H0 at all levels of α; extremely strong evidence of a significant relationship between x and y.
e. t² = (-7.33)² = 53.7289 (approximately equals F = 53.6949)
(t.025)² = (2.447)² = 5.99 = F.05
LO8
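The F = t² identity in 13.50(e) is easy to check numerically. The inputs below are the rounded 13.50/13.20 values, so F agrees with the solution's 53.6949 only to about one decimal place.

```python
# Simple linear regression: the overall F statistic equals the slope t^2.
explained, sse, df_error = 22.9808, 2.5679, 6

f_stat = explained / (sse / df_error)   # F = MSR / MSE, with df1 = 1
t_stat = -7.3277                        # slope t statistic from 13.20

print(round(f_stat, 1))                 # ≈ 53.7
print(round(t_stat ** 2, 1))            # ≈ 53.7, equal up to rounding
```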
13.51 a. F = 59.942 / (1.438 / 5) = 208.39
b. F.05 = 6.61 df1 = 1, df2 = 5
Since 208.39 > 6.61, reject H0 with strong evidence of a significant relationship between x and y.
c. F.01 = 16.26 df1 = 1, df2 = 5
Since 208.39 > 16.26, reject H0 with very strong evidence of a significant relationship between x and y.
d. p-value < .001; reject H0 at all levels of α; extremely strong evidence of a significant relationship between x and y.
e. t² = (14.44)² = 208.51 (approximately equals F = 208.39)
(t.025)² = (2.571)² = 6.61 = F.05
LO8
13.52 a. F = 19918.844 / (191.7017 / 9) = 935.149
b. F.05 = 5.12 df1 = 1, df2 = 9
Since 935.149 > 5.12, reject H0 with strong evidence of a significant relationship between x and y.
c. F.01 = 10.56 df1 = 1, df2 = 9
Since 935.149 > 10.56, reject H0 with very strong evidence of a significant relationship between x and y.
d. p-value < .001; reject H0 at all levels of α; extremely strong evidence of a significant relationship between x and y.
e. t² = (30.58)² = 935.14 (approximately equals F = 935.149)
(t.025)² = (2.262)² = 5.12 = F.05
LO8
13.53 a. F = 106.303
b. F.05 = 4.20; reject H0 (df1 = 1, df2 = 28). Strong evidence of a significant relationship between x and y.
c. F.01 = 7.64; reject H0 (df1 = 1, df2 = 28). Very strong evidence of a significant relationship between x and y.
d. p-value < .001; reject H0. Extremely strong evidence of a significant relationship between x and y.
e. t² = F (within rounding error)
(t.025)² = 4.19 = F.05
LO8
13.54 a. F = 13,720.47
b. Reject H0.
c. Reject H0.
d. p-value < .001; reject H0.
e. t² = F (within rounding error)
LO8
13.55 a. F = 6550.7 / (896.8 / 8) = 58.43
b. F.05 = 5.32 df1 = 1, df2 = 8
Since 58.43 > 5.32, reject H0.
c. F.01 = 11.26 df1 = 1, df2 = 8
Since 58.43 > 11.26, reject H0.
d. p-value < .001; reject H0 at all levels of α.
e. t² = (7.64)² = 58.37 (approximately equals F = 58.43)
(t.025)² = (2.306)² = 5.32 = F.05
LO8
13.56 They should be plotted against the independent variable and against ŷ. Funneling or curved patterns indicate violations of the regression assumptions.
LO9
13.57 Create a histogram, stem-and-leaf, and normal plot.
LO9
13.58 Transforming the dependent variable.
LO9
13.59 Approximate horizontal band appearance. No violations indicated.
LO9
13.60 Possible violations of the normality and constant variance assumptions.
LO9
13.61 No.
LO9
13.62 a. For i = 4: (3(4) - 1)/(3(11) + 1) = 11/34 = .3235
.5000 - .3235 = .1765 ⇒ z = -.46
For i = 10: (3(10) - 1)/(3(11) + 1) = 29/34 = .8529
.8529 - .5000 = .3529 ⇒ z = 1.05
b. No
LO9
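The normal-plot ordinates in 13.62 can be computed directly: find the z value whose cumulative normal probability equals (3i - 1)/(3n + 1). A sketch using the standard library's NormalDist (Python 3.8+):

```python
from statistics import NormalDist

def normal_score(i, n):
    """z value with cumulative area (3i - 1)/(3n + 1) under the standard normal."""
    return NormalDist().inv_cdf((3 * i - 1) / (3 * n + 1))

print(round(normal_score(4, 11), 2))    # ≈ -0.46, as in part a
print(round(normal_score(10, 11), 2))   # ≈ 1.05, as in part a
```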
13.63 The residual plot has somewhat of a cyclical appearance. Since d = .473 is less than dL,.05 = 1.27, we conclude there is positive autocorrelation; and since 4 - .473 = 3.527 is greater than dU,.05 = 1.45, we conclude there is not negative autocorrelation.
LO9
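The Durbin-Watson statistic compared with dL and dU in 13.63 is d = Σ(e_t - e_{t-1})² / Σe_t². The residuals below are made up for illustration; the original exercise's residuals are not reproduced here.

```python
def durbin_watson(resid):
    """Durbin-Watson d: small values signal positive autocorrelation."""
    num = sum((resid[t] - resid[t - 1]) ** 2 for t in range(1, len(resid)))
    return num / sum(e ** 2 for e in resid)

# Hypothetical, smoothly drifting residuals (positively autocorrelated):
d = durbin_watson([1.0, 0.8, 0.6, 0.4, 0.2, -0.2, -0.4, -0.6])
print(round(d, 3))   # well below dL = 1.27, consistent with positive autocorrelation
```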
13.64 The plot of the residuals shows no pattern indicating non-constant variance, and the residuals are centered around 0.
LO9
13.65 a. ln yt = 2.07012 + 0.25688t
ln y16 = 2.07012 + 0.25688(16) = 6.1802
b. e6.1802 = 483.09
e5.9945 = 401.22
e6.3659 = 581.67
c. Growth rate = e0.25688 = 1.293
This means the growth rate is expected to be 29.3% per year.
LO9
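The 13.65 forecast and growth rate follow directly from the fitted log model; a minimal sketch using the equation ln(y_t) = 2.07012 + 0.25688t from part a:

```python
from math import exp

# Forecast at t = 16 and implied growth rate from the log regression.
t = 16
ln_y = 2.07012 + 0.25688 * t
y_hat = exp(ln_y)                  # back-transform to original units
growth = exp(0.25688) - 1          # one-period (annual) growth rate

print(round(ln_y, 4))              # 6.1802
print(round(y_hat, 2))             # ≈ 483.09
print(round(100 * growth, 1))      # ≈ 29.3 (% per year)
```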
13.66 a. Yes; see the plot in part c.
b.
c.
[Scatter plot of y versus x (x roughly 2.0 to 3.2, y roughly 220 to 260) omitted.]
d. p-value < .001; reject H0; significant.
e.
LO1, LO2, LO4, LO5
13.67 a. b1 = -6.4424. For every unit increase in width difference, the mean number of accidents is reduced by 6.4 per 100 million vehicles.
b. p-value < .001. Reject H0 at all levels of α.
c. r2 = .984 98.4% of the variation in accidents is explained by the width difference.
LO2, LO4, LO6
13.68 a. No
b. Possibly not; Don’t take up smoking
LO1, LO2
13.69 For aggressive stocks, a 95% confidence interval for β1 is [.0163 ± t.025(.003724)] = [.0163 ± 2.365(.003724)] = [.00749, .02512], where t.025 is based on 7 degrees of freedom. We are 95% confident that the effect of a one-month increase in the return length time for an aggressive stock is to increase the mean value of the estimate of β by between .00749 and .02512.
For defensive stocks, a 95% confidence interval for β1 is [-0.00462 ± 2.365(.00084164)] = [-.00661, -.00263].
For neutral stocks, a 95% confidence interval for β1 is [.0087255 ± 2.365(.001538)] = [.005088, .01236].
LO2, LO4
13.70 a. Using Figure 13.42, there does seem to be a negative relationship between temperature and o-ring failure.
b. The temperature of 31 was outside the experimental region.
LO1, LO2
13.71 a. There is a relationship since F = 21.13 with a p-value of .0002.
b. b1 = 35.2877, [19.2202,51.3553]
LO4
Internet Exercise -- Answers will vary depending on when data was obtained.
13.72
The regression equation is
GMAT = 184 + 141 GPA

Predictor   Coef     SE Coef   T      P
Constant    184.27   84.63     2.18   0.034
GPA         141.08   25.36     5.56   0.000

S = 21.50   R-Sq = 39.2%   R-Sq(adj) = 37.9%

Analysis of Variance
Source           DF   SS      MS      F       P
Regression       1    14316   14316   30.96   0.000
Residual Error   48   22197   462
Total            49   36513

Predicted Values for New Observations
New Obs   Fit      SE Fit   95.0% CI             95.0% PI
1         678.06   5.16     (667.68, 688.45)     (633.60, 722.53)

Values of Predictors for New Observations
New Obs   GPA
1         3.50
LO1, LO2, LO6, LO8