For Exercise 15.1, see page 344; for Exercise 15.2, see page 348; for Exercise 15.3, see page 352.
15.4 Least-squares. The least-squares regression line
(a) is the line that makes the sum of the vertical distances of the data points from the line as small as possible.
(b) is the line that makes the sum of the vertical distances of the data points from the line as large as possible.
(c) is the line that makes the sum of the squared vertical distances of the data points from the line as small as possible.
(d) is the line that makes the sum of the squared vertical distances of the data points from the line as large as possible.
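The criterion in Exercise 15.4 can be checked numerically: the least-squares line is the one line whose sum of squared vertical distances cannot be beaten by any other line. A minimal sketch, using made-up data points chosen only for illustration:

```python
# Hypothetical data points, chosen only to illustrate the least-squares criterion.
xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [2.1, 3.9, 6.2, 8.1, 9.8]

def fit_least_squares(xs, ys):
    """Slope and intercept minimizing the sum of squared vertical distances."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    b = sxy / sxx       # slope
    a = my - b * mx     # intercept
    return a, b

def sum_sq(a, b, xs, ys):
    """Sum of squared vertical distances of the points from the line y = a + b*x."""
    return sum((y - (a + b * x)) ** 2 for x, y in zip(xs, ys))

a, b = fit_least_squares(xs, ys)
best = sum_sq(a, b, xs, ys)
# Nudging the slope or intercept of any other line gives a larger sum of squares.
worse = sum_sq(a, b + 0.1, xs, ys)
assert best < worse
```

Perturbing the fitted slope or intercept in any direction always increases the sum of squared distances, which is what answer (c) asserts.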
15.5 Correlation. The quantity that tells us what fraction of the variation in the responses is explained by the straight-line tie between the response and explanatory variables is
(a) the correlation.
(b) the absolute value of the correlation.
(c) the square root of the correlation.
(d) the square of the correlation.
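The quantity asked about in Exercise 15.5 is the square of the correlation, r². A small sketch with hypothetical data can confirm the identity numerically: for the least-squares line, r² equals the fraction of the variation in the responses explained by the line, 1 − SSE/SST:

```python
import math

# Hypothetical data, used only to check the r-squared identity numerically.
xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [2.1, 3.9, 6.2, 8.1, 9.8]

n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n
sxx = sum((x - mx) ** 2 for x in xs)
syy = sum((y - my) ** 2 for y in ys)          # total variation in the responses
sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))

r = sxy / math.sqrt(sxx * syy)                # the correlation

# Least-squares line and its leftover (unexplained) sum of squares.
b = sxy / sxx
a = my - b * mx
sse = sum((y - (a + b * x)) ** 2 for x, y in zip(xs, ys))

explained = 1 - sse / syy                     # fraction of variation explained
assert abs(r ** 2 - explained) < 1e-9
```

The correlation r itself (answers a and b) can be negative or misleadingly large in magnitude; only its square measures the explained fraction.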
15.6 Extrapolation. Extrapolation, or prediction outside the range of the data, is risky
(a) because the pattern observed in the data may be different outside the range of the data.
(b) because correlation does not necessarily imply causation.
(c) unless the correlation is very close to 1.
(d) unless the square of the correlation is very close to 1.
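The risk named in Exercise 15.6 is easy to demonstrate. In this sketch, hypothetical data follow a curve (y = √x); a straight line fits well inside the observed range of x, but the prediction far outside that range is badly wrong:

```python
import math

# Hypothetical data following a curve (y = sqrt(x)). Within x = 1..9 a
# straight line fits well, but the pattern changes outside that range.
xs = list(range(1, 10))
ys = [math.sqrt(x) for x in xs]

n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n
b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
a = my - b * mx

def predict(x):
    """Prediction from the fitted straight line."""
    return a + b * x

inside_err = max(abs(predict(x) - math.sqrt(x)) for x in xs)   # small
outside_err = abs(predict(100) - math.sqrt(100))               # large
assert outside_err > 10 * inside_err
```

Note that the line here fits the observed data well (r² is close to 1), yet the extrapolated prediction still fails, which is why answers (c) and (d) are wrong: a high correlation within the data offers no protection outside it.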
15.7 Prediction. An observed relationship between two variables can be used for prediction
(a) as long as we know the relationship is due to direct causation.
(b) as long as the relationship is a straight-line relationship.
(c) as long as the patterns found in past data continue to hold true.
(d) in all of the above instances.
15.8 Causation. The best evidence that changes in one variable cause changes in another comes from
(a) randomized comparative experiments.
(b) data for which the square of the correlation is near 1.
(c) observing that higher values of the explanatory variable are associated with stronger responses.
(d) a plausible theory for causation.