Use the Mean Value Theorem to show that a car that travels 110 mi in 2 h must have had a speed of 55 miles per hour (mph) at least once during the 2 h.
Solution Let $s = f(t)$ represent the distance $s$ the car has traveled after $t$ hours. Its average velocity during the time period from $t = 0$ to $t = 2$ h is

$$\text{Average velocity} = \frac{f(2) - f(0)}{2 - 0} = \frac{110 - 0}{2 - 0} = 55 \text{ mph}$$
Since $f$ is continuous on $[0, 2]$ and differentiable on $(0, 2)$ (a reasonable assumption for the position of a moving car), the Mean Value Theorem guarantees a time $t_0$, $0 < t_0 < 2$, at which

$$f'(t_0) = \frac{f(2) - f(0)}{2 - 0} = 55$$
That is, the car had a velocity of 55 mph at least once during the 2-h period.
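For a concrete illustration, suppose (as a hypothetical choice, since the problem gives no formula for $f$) that the position function is $f(t) = 27.5t^2$, which satisfies $f(0) = 0$ and $f(2) = 110$. Then $f'(t) = 55t$, and setting the instantaneous velocity equal to the average velocity gives

$$f'(t_0) = 55t_0 = 55 \quad\Longrightarrow\quad t_0 = 1$$

so this particular car travels at exactly 55 mph at $t_0 = 1$ h, a time that lies in the open interval $(0, 2)$, just as the Mean Value Theorem predicts.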