To travel at an average speed of 15 mph means you will take 4 minutes to travel each mile.
To travel at an average speed of 30 mph means you will take 2 minutes to travel each mile.
Therefore, to average 30 mph for the 2-mile trip you must cover the distance in 4 minutes.
If you travel the first mile of the trip at 15 mph it will take 4 minutes.
So at the completion of the first mile you have already used up all the time available to average 30 mph.
Therefore it is impossible to complete the trip in the necessary time to average 30 mph.
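The time arithmetic above can be checked with a few lines of Python (a quick sketch of the same reasoning, nothing more):

```python
# Minutes needed to cover one mile at a given speed (mph).
def minutes_per_mile(mph):
    return 60 / mph

# Time budget to average 30 mph over the 2-mile trip.
budget = 2 * minutes_per_mile(30)   # 2 miles x 2 min/mile = 4.0 minutes

# Time already spent on the first mile at 15 mph.
spent = minutes_per_mile(15)        # 4.0 minutes

print(budget - spent)  # 0.0 -- no time left for the second mile
```

The budget is used up exactly at the end of mile one, which is why no finite speed on the second mile can rescue the 30 mph average.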
If you travel the second mile at 45 mph, as has been suggested, it does not give you an average of 30 mph.
At 45 mph it will take an additional 1 minute 20 seconds to travel the second mile.
(60 / 45 = 1.333 minutes)
That means the total time for the 2 mile trip is now 5 minutes 20 seconds.
(4 minutes for the first mile plus 1 1/3 minutes for the second mile)
Therefore the average time per mile is half of that, or 2 minutes 40 seconds (2.667 minutes).
That equates to an average speed of 22.5 mph for the 2 mile trip.
(60 minutes / 2.6667 minutes per mile = 22.5 mph)
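For anyone who wants to verify the 22.5 mph figure, here is the same calculation in Python, with average speed taken as total distance over total time:

```python
# One mile at 15 mph plus one mile at 45 mph, times in hours.
total_hours = 1 / 15 + 1 / 45   # 4 minutes + 1 min 20 s

# Average speed = total distance / total time.
average_mph = 2 / total_hours   # approximately 22.5

print(average_mph)
```

Note that 22.5 is well short of 30, which matches the conclusion above.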
There is no limitation on ‘time’ given in the ‘test question’.
OK. Let's say we did: averaged 15 mph for the first mile (uphill, right?).
Now, let's say that on the downhill run, we average 45 mph.
That would take 1 1/3 minutes (1 minute 20 seconds, as you say). The uphill part took 4.
OK. 1 1/3 + 4 = 5 1/3 minutes. Which means exactly squat, and has nothing to do with average miles per hour.
The average mph for the first mile was 15, for the second it was 45, and therefore the average mph for the 2-mile course is (15 + 45) = 60, divided by 2, giving an average of 30 mph over the 2-mile course.
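For what it's worth, the two posts are computing two different "averages", and a quick Python check shows where they diverge (mean of the two speeds vs. total distance over total time):

```python
# Method 1: arithmetic mean of the two mile-by-mile speeds.
arithmetic_mean = (15 + 45) / 2        # 30.0 mph

# Method 2: average speed as total distance / total time,
# which weights each speed by the time spent at it.
time_weighted = 2 / (1 / 15 + 1 / 45)  # approximately 22.5 mph

print(arithmetic_mean, time_weighted)
```

The arithmetic mean treats the two miles as equal, but the slow mile takes three times as long as the fast one, so the trip-wide average is pulled down to 22.5 mph.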