I stumbled across a test online and one of the math questions stumped me. Here it is:
In a race from point X to point Y and back, Jack averages 30 miles per hour to point Y and 10 miles per hour back to point X. Sandy averages 20 miles per hour in both directions. If Jack and Sandy begin the race at the same time, who will finish first?
My gut instinct was that Jack and Sandy would tie, since it appeared to me that both of their average speeds were 20 miles per hour for the whole race. (Jack's speeds of 30mph and 10mph over the same distance average out to 20mph, which matches Sandy's speed, right?) But the correct answer is that Sandy finishes first. Doing some basic math, it's easy to see why. Assume the race is 20 miles from X to Y. In that case, Sandy completes the race in exactly 2 hours (one hour each way). But for Jack, the second leg of the race alone takes 2 hours (20 miles at 10mph), so he has obviously lost: after 2 hours Sandy is already finished, while Jack is still on his way back.
So, I obviously calculated Jack's total average speed wrong. What I didn't take into account is that MPH is averaged over time, not distance, so you can't just take Jack's two speeds, add them together, and divide by two, even though they cover the same distance. Instead, to compute his true average speed, you have to divide the total distance (40 miles) by the total time the trip takes. Here, that's 0.667 hours for the first leg (20 miles at 30mph) plus 2 hours for the second leg, for a total of 2.667 hours. And 40 divided by 2.667 gives an average speed of 15 mph. (That's the harmonic mean of 30 and 10, not the arithmetic mean.) So, in reality, his average speed is well below Sandy's 20 mph. Weird, huh?
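The arithmetic above is quick to sanity-check in a few lines of Python (using the same assumed 20-mile leg from the example):

```python
leg = 20  # miles each way, same assumption as above

# Jack: 30 mph out, 10 mph back
jack_time = leg / 30 + leg / 10   # hours for each leg, summed
jack_avg = (2 * leg) / jack_time  # total distance / total time

# Sandy: 20 mph in both directions
sandy_time = 2 * leg / 20

print(round(jack_time, 3))   # total time for Jack: 2.667 hours
print(round(jack_avg, 3))    # Jack's true average speed: 15.0 mph
print(round(sandy_time, 3))  # total time for Sandy: 2.0 hours
```

Changing `leg` to any other distance gives the same 15 mph for Jack, which shows the result doesn't depend on the 20-mile assumption.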
Maybe I just shouldn't be thinking about such things at 1:00am.
And to think one of my majors in college was mathematics...