At one point on our trip from Fort Collins to Salt Lake City at Thanksgiving, I noted to my wife that we had gone exactly 80 miles in 90 minutes (an average of just 53 1/3 mph, well under 60 miles per hour). At that point, we were on I-80 traveling on cruise control at the maximum legal speed limit of 75 mph.

How long would it take until we averaged exactly 60 mph from the beginning of the trip?

Solution to the Problem:

It would take another 40 minutes (2/3 hour) until we averaged exactly 60 mph.
At that point, we would have traveled a total of 130 miles (80 miles plus an additional 50 miles) in a total of 2 1/6 hours (1.5 hours plus an additional 2/3 hour), and 130 miles divided by 13/6 hours is exactly 60 mph.

I solved this as a Rate-Time-Distance problem.
Let x = time traveled at 75 mph until we reach the 60 mph average.
Then set up a table:

Rate       Time             Distance
----       1.5 hr           80 miles
75 mph     x hr             75x miles
60 mph     (1.5 + x) hr     (80 + 75x) miles

Now, since Rate × Time = Distance, set up the equation:
60(1.5 + x) = 80 + 75x.
This yields 90 + 60x = 80 + 75x, so 10 = 15x.
Solving, we obtain x = 2/3 hour (or 40 minutes).
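
As a quick sanity check, here is a minimal Python sketch (the variable names are my own, not part of the original solution) that solves the same linear equation and verifies the resulting average:

    # Numbers from the problem: 80 miles in the first 1.5 hours, then 75 mph.
    initial_distance = 80.0   # miles already traveled
    initial_time = 1.5        # hours elapsed so far
    cruise_speed = 75.0       # mph on cruise control
    target_average = 60.0     # desired overall trip average, mph

    # Rearranging 60(1.5 + x) = 80 + 75x into a linear equation in x:
    x = (initial_distance - target_average * initial_time) / (target_average - cruise_speed)
    print(f"x = {x:.4f} hours = {x * 60:.0f} minutes")          # 0.6667 hours = 40 minutes

    # Verify: total distance divided by total time should be exactly 60 mph.
    total_distance = initial_distance + cruise_speed * x        # 130 miles
    total_time = initial_time + x                               # 13/6 hours
    print(f"average = {total_distance / total_time:.1f} mph")   # 60.0 mph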


Correctly solved by:

1. James Alarie, University of Michigan -- Flint, Flint, Michigan
2. Sagar Patel, Brookstone School, Columbus, Georgia
3. David & Judy Dixon, Bennettsville, South Carolina
4. Shaan Arora, Brookstone School, Columbus, Georgia
5. ciuba@bellsouth.net
6. Richard K. Johnson, La Jolla, California
7. Tristan Collins, Virginia Tech, Blacksburg, Virginia
8. Neal Amos, Brookstone School, Columbus, Georgia
9. John Funk, Ventura, California
10. Mr. Robb's Discrete Math Class, John Handley High School, Winchester, Virginia