Applying the Mean Value Theorem to Rectilinear Motion

Use the Mean Value Theorem to show that a car that travels \(110\) mi in \(2\) h must have had a velocity of \(55\) miles per hour (mph) at least once during the \(2\)-hour trip.

Solution Let \(s=f(t)\) represent the distance \(s\), in miles, that the car has traveled after \(t\) hours, so \(f(0)=0\) and \(f(2)=110\). Its average velocity during the time period from \(0\) to \(2\) h is \[ \hbox{Average velocity}=\frac{f(2)-f(0)}{2-0}=\frac{110-0}{2-0}=55\ {\rm mph} \]

Assuming, as is physically reasonable, that \(f\) is continuous on the closed interval \([0,2]\) and differentiable on the open interval \((0,2)\), the Mean Value Theorem applies: there is a time \(t_{0}\), \(0<t_{0}<2,\) at which \[ f^\prime (t_{0}) =\frac{f(2) -f( 0) }{2-0}=55 \]

Since \(f^\prime (t_{0})\) is the velocity of the car at time \(t_{0}\), the car had a velocity of \(55\) mph at least once during the \(2\)-hour period.
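
Although the Mean Value Theorem does not tell us when this happens, a concrete illustration may help. Suppose, purely hypothetically, that the car's position were \(f(t)=55t+10\sin (\pi t)\) miles; this function is not given in the problem, but it does satisfy \(f(0)=0\) and \(f(2)=110\), so the average velocity over the trip is still \(55\) mph. Then \[ f^\prime (t)=55+10\pi \cos (\pi t)=55\quad \hbox{exactly when}\quad \cos (\pi t)=0,\ \hbox{that is, at}\ t_{0}=\frac{1}{2}\ \hbox{and}\ t_{0}=\frac{3}{2} \] so this particular car travels at exactly \(55\) mph twice during the trip, consistent with the theorem's guarantee of at least one such time.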