James heads out for a run. First he runs on a level road; then he comes to a hill and runs to the top. When he gets to the top of the hill, he turns around and runs back exactly the way he came.
Now, on level ground, James can run at 8 miles an hour. Uphill, he can run at 6 miles an hour. And downhill, he can run at 12 miles an hour. When he gets back home, he notices that he has run for exactly two hours.
So the question is: how far did he run? It appears that there is not enough information to solve the problem, but there is!
Solution to the Problem:
The answer is 16 miles.
First, you must realize that the average speed for going up the hill and coming back down the same hill is 8 miles per hour, exactly the same as James's speed on level ground. That makes the average speed for the entire trip 8 mph, no matter where the hill starts or how long it is. Since James ran for two hours at an average of 8 mph, the total distance that he ran was 16 miles.
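To see why the length of the hill drops out, here is the general calculation for a hill of length d miles (the variable d is introduced purely for illustration; it is not part of the original problem):

\[
\text{average speed up and down} = \frac{2d}{\frac{d}{6} + \frac{d}{12}} = \frac{2d}{\frac{d}{4}} = 8 \text{ mph}.
\]

The d cancels, so the 8 mph average holds for a hill of any length.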
The difficult part of this problem is understanding that the average speed for going up and down the hill is 8 mph. To see this, choose any distance for the hill, say 6 miles. Then it takes James 1 hour to run up the hill (6 miles at 6 mph) and 1/2 hour to run back down (6 miles at 12 mph). So he runs 12 miles (up and down the hill) in 1.5 hours, and his average speed is 12 divided by 1.5 = 8 mph.
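As a further check, here is a short Python sketch (the route splits below are hypothetical examples, not values from the problem) showing that any 16-mile out-and-back route of this kind takes exactly two hours, that is, averages 8 mph:

```python
# Numerical check of the key fact: whatever the split between level road
# and hill, James's out-and-back average speed is 8 mph.
# The (level, hill) splits below are hypothetical; each totals 8 miles
# one way, i.e. a 16-mile round trip.

def total_time(level_miles, hill_miles):
    """Hours for the full run: level road at 8 mph each way,
    up the hill at 6 mph, back down at 12 mph."""
    return 2 * level_miles / 8 + hill_miles / 6 + hill_miles / 12

def total_distance(level_miles, hill_miles):
    """Miles covered out and back."""
    return 2 * (level_miles + hill_miles)

for level, hill in [(2.0, 6.0), (5.0, 3.0), (0.0, 8.0), (7.5, 0.5)]:
    t = total_time(level, hill)
    d = total_distance(level, hill)
    print(f"level={level} mi, hill={hill} mi -> {d} mi in {t:.2f} h, "
          f"avg {d / t:.1f} mph")
```

Every split prints 16.0 miles in 2.00 hours at 8.0 mph, which is exactly the reasoning behind the answer.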
Correctly solved by:
1. K. Sengupta, Calcutta, India
2. John Funk, Ventura, California
3. James Alarie, Flint, Michigan
4. David & Judy Dixon, Bennettsville, South Carolina
5. Richard K. Johnson, La Jolla, California