Originally Posted by Ronstar
OK, I take it this way: the driver traveled at 30 mph over the first mile, so let's take 30 mph to be the average speed for that mile.
If the driver started from 0 mph and accelerated steadily, the segment's average speed is the mean of its start and end speeds, so his/her speed at the end of the first mile is 60 mph.
The problem states that the driver averages 60 mph over the 2 miles. Since the driver averaged 30 mph in the first mile, he/she needs to average 90 mph in the second mile for the whole 2 miles to average 60 mph: (30 mph average in mile 1 + 90 mph average in mile 2) / 2 = 60 mph.
Since the driver was doing 60 mph at the start of mile 2, he/she needs to be going 120 mph at the end of mile 2 to average 90 mph over that mile: (60 + 120) / 2 = 90 mph.
The answer is 120 mph.
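To make the arithmetic in the quoted post concrete, here is a minimal Python sketch of that calculation. It assumes, as the post does, that each mile is driven at constant acceleration (so a segment's average speed is the mean of its start and end speeds) and that the 2-mile average is taken as the arithmetic mean of the two per-mile averages; the function name is just for illustration.

```python
# Sketch of the quoted reasoning (assumptions: constant acceleration within
# each mile, and the 2-mile average taken as the arithmetic mean of the two
# per-mile averages). The name end_speed() is illustrative only.

def end_speed(start_mph, avg_mph):
    # With constant acceleration, a segment's average speed is the mean of
    # its start and end speeds, so: end = 2 * avg - start.
    return 2 * avg_mph - start_mph

TARGET_AVG = 60   # required average over the full 2 miles (mph)
MILE1_AVG = 30    # stated average for the first mile (mph)

# End of mile 1: start from 0 mph, average 30 mph over the mile.
end_mile1 = end_speed(0, MILE1_AVG)           # 60 mph

# Average needed in mile 2 so that (30 + x) / 2 = 60.
mile2_avg = 2 * TARGET_AVG - MILE1_AVG        # 90 mph

# End of mile 2: start at 60 mph, average 90 mph over the mile.
end_mile2 = end_speed(end_mile1, mile2_avg)   # 120 mph

print(end_mile1, mile2_avg, end_mile2)        # 60 90 120
```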
I saw several references in the replies to a 2-minute limit, and therefore claims that it's impossible. I saw no 2-minute limitation, or any time limitation at all, in the OP's original problem statement. The first sentence says "travel at a speed of 30 mph," and mph is by far the most common way we measure a rate of speed. Since it is measurable, we can do the math.
I see no reason to read a 2-minute time limit into the problem statement.
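For context on the 2-minute figure the other replies mention, here is the time-based calculation those replies are using (time = distance / speed); whether that reading applies is exactly what's being debated here. This sketch is mine, not part of the original posts.

```python
# Where the "2 minute" figure in the replies comes from: time = distance / speed.
# This reflects the time-based reading used by other posters, not necessarily
# the reading argued for above.

MPH_TO_MIN = 60                        # minutes per hour

time_allowed = 2 / 60 * MPH_TO_MIN     # 2 miles at a 60 mph average -> 2.0 min
time_mile1 = 1 / 30 * MPH_TO_MIN       # 1 mile at a 30 mph average  -> 2.0 min

print(time_allowed, time_mile1)        # 2.0 2.0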
I've read through this whole thread, and the answer above is the only one based on the practical physical capabilities of a car (or rocket, airplane, person, or any other object). My answer would be 90 mph if an object could jump from 30 to 90 mph instantaneously; since it can't, I'm changing my answer to 120 mph based on general physical properties.