If you travel at 30 mph for the first mile, how fast do you have to go in the second mile to average 60 mph over the 2-mile stretch?
So I'll post before looking at the responses.
This isn't that hard; I doubt it took Einstein an hour (is there any proof of that?).
60 mph is a mile a minute. So to average 60 mph over 2 miles will take 2 minutes.
You traveled the first mile at 30 mph. Well, if 60 mph is a mile a minute, 30 mph is a mile in two minutes. You've already used up all the time, so you can't average 60 mph with only one mile to go (or one foot to go!). Can't be done, at any speed.
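The time-budget argument above can be checked with a few lines of Python (variable names are just mine, for illustration):

```python
# Time budget for averaging 60 mph over a 2-mile trip,
# given the first mile was driven at 30 mph.
TRIP_MILES = 2
TARGET_AVG_MPH = 60
FIRST_MILE_MPH = 30

time_allowed_hr = TRIP_MILES / TARGET_AVG_MPH  # 2/60 hr = 2 minutes
time_used_hr = 1 / FIRST_MILE_MPH              # 1/30 hr = 2 minutes
time_left_hr = time_allowed_hr - time_used_hr  # nothing left for mile 2

print(f"Time allowed: {time_allowed_hr * 60:.0f} min")
print(f"Time used on mile 1: {time_used_hr * 60:.0f} min")
print(f"Time left for mile 2: {time_left_hr * 60:.0f} min")
```

Zero minutes left for the second mile, so no finite speed works.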
A similar problem comes up with mpg. If you get 20 mpg on the way to a destination (say, uphill or into a headwind) and 40 mpg on the way back, your average is *not* 30 mpg. An mpg figure doesn't average that way, because the distance you cover on a gallon differs between the two legs. To average 30 mpg, you'd have to go 20 miles @ 20 mpg and 40 miles @ 40 mpg. The distances have to be different.
It's also why going from a 20 mpg car to a 40 mpg car doesn't save as much fuel as going from a 10 mpg car to a 20 mpg car.
This leads people to make bad decisions.
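To put numbers on that comparison, here's a small Python check, counting gallons per 100 miles instead of miles per gallon (the helper name is mine):

```python
def gallons_saved_per_100_miles(old_mpg, new_mpg):
    """Fuel saved over 100 miles when upgrading from old_mpg to new_mpg."""
    return 100 / old_mpg - 100 / new_mpg

# Same "doubling" of mpg, very different fuel savings:
print(gallons_saved_per_100_miles(10, 20))  # 5.0 gallons per 100 miles
print(gallons_saved_per_100_miles(20, 40))  # 2.5 gallons per 100 miles
```

The 10-to-20 mpg upgrade saves twice the fuel of the 20-to-40 one, which is why gallons per 100 miles is the clearer metric for this decision.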
Now, I hope I got all that right, or I'll be embarrassed! At least I avoided using affect and effect!
-ERD50