Use the Mean Value Theorem to show that a car that travels \(110\) mi in \(2\) h must have had a speed of \(55\) miles per hour (mph) at least once during the 2 h.
Let \(f(t)\) denote the distance, in miles, the car has traveled after \(t\) hours, so that \(f(0) = 0\) and \(f(2) = 110\). Since \(f\) is continuous on \([0,2]\) and differentiable on \((0,2)\), the Mean Value Theorem guarantees there is a time \(t_{0}\), \(0<t_{0}<2\), at which \[ f^{\prime}(t_{0}) = \dfrac{f(2) - f(0)}{2-0} = \dfrac{110 - 0}{2} = 55 \]
That is, the car had a speed of \(55\) mph at least once during the \(2\)-hour period.
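As a concrete illustration, consider a hypothetical position function (not given in the problem) satisfying the same endpoint data, say \(f(t) = 27.5t^{2}\), for which \(f(0) = 0\) and \(f(2) = 110\). Then \[ f^{\prime}(t) = 55t, \qquad \text{so } f^{\prime}(t_{0}) = 55 \text{ exactly when } t_{0} = 1 \] consistent with the conclusion of the Mean Value Theorem: this particular car reaches \(55\) mph at the one-hour mark.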