OK. Let's say we did. Averaged 15 mph for 1 mile (uphill, right?)
Now, let's say that on the downhill run, we average 45 mph.
That would take about 1.33 minutes (60 minutes / 45 mph, not the 1.2 you quoted). The uphill part took 4.
OK. 1.33 + 4 = 5.33 minutes. Which means exactly squat, and has nothing to do with average miles per hour.
The average mph for the first mile was 15, for the second it was 45, and therefore the average mph for the 2-mile course = (15 + 45) / 2 = 30 mph over a 2-mile course.
Nope. Average speed is total distance divided by total time. 2 miles / 5.33 minutes = 22.5 miles per hour.
Wrong-o, wrong-o.
The average speed for NASCAR drivers isn't calculated by averaging their lap speeds, but by dividing the distance covered by the elapsed time.
It has ever been thus...
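To make the "total distance over total time" point concrete, here's a minimal sketch using the numbers from this thread (1 mile at 15 mph up, 1 mile at 45 mph down); the segment list and variable names are just illustration:

```python
# Each segment is (distance in miles, speed in mph).
segments = [(1.0, 15.0), (1.0, 45.0)]

total_distance = sum(d for d, _ in segments)       # 2.0 miles
total_time_hr = sum(d / v for d, v in segments)    # 1/15 + 1/45 = 4/45 hours

# True average speed: total distance divided by total time.
average_speed = total_distance / total_time_hr

# The tempting-but-wrong version: arithmetic mean of the segment speeds.
naive_average = sum(v for _, v in segments) / len(segments)

print(f"true average:  {average_speed:.1f} mph")   # 22.5
print(f"naive average: {naive_average:.1f} mph")   # 30.0
```

Because the two segments cover equal distances, the true figure is the harmonic mean of the two speeds, which is always pulled toward the slower leg; that's why averaging the mph numbers overstates it.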