Exactly what I was thinking.
Now... the exact number is unknown, since a real vehicle wouldn’t actually have instantaneous acceleration.
The ‘time’ might help calculate the change in acceleration, but that would only help on the uphill side.
Time is given in both cases, as speed is defined by ‘distance and time’ (mph).
Assuming ideal (and impossible) conditions...
1 mile at 15 mph +
1 mile at x mph = (15+x)/2
(15+x)/2=y
Given y=30
x=45
So.... what are we missing ?
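For anyone who wants to check the algebra above, here’s a quick sketch (variable names are mine). Note that it treats the overall average as the plain arithmetic mean of the two leg speeds, which is exactly the assumption being questioned:

```python
# Sketch of the arithmetic-mean reasoning above.
v1 = 15      # mph over the first mile
target = 30  # mph desired average over both miles

# Solving (v1 + x) / 2 = target for x:
x = 2 * target - v1
print(x)  # 45
```

Averaging the speeds directly like this only works if you spend equal *time* at each speed, not equal *distance*.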
To travel at an average speed of 15 mph means you will take 4 minutes to travel each mile.
To travel at an average speed of 30 mph means you will take 2 minutes to travel each mile.
Therefore, to average 30 mph for the 2 mile trip you must travel the distance in 4 minutes.
If you travel the first mile of the trip at 15 mph it will take 4 minutes.
So at the completion of the first mile you have already used up all the time available to average 30 mph.
Therefore it is impossible to complete the trip in the necessary time to average 30 mph.
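The time-budget argument above can be checked in a few lines of Python (a minimal sketch, working in minutes):

```python
# Time budget for a 30 mph average over 2 miles, in minutes.
time_budget = 60 * 2 / 30   # 4.0 minutes for the whole trip
first_mile = 60 * 1 / 15    # 4.0 minutes for the first mile at 15 mph
remaining = time_budget - first_mile
print(remaining)  # 0.0 minutes left for the second mile
```

The entire 4-minute budget is gone after the first mile, so no second-mile speed can save the average.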
If you travel the second mile at 45 mph, as has been suggested, it does not give you an average of 30 mph.
At 45 mph it will take an additional 1 min. 20 seconds to travel the second mile.
(60/45 = 1.333 minutes)
That means the total time for the 2 mile trip is now 5 minutes 20 seconds.
(4 minutes for the first mile plus 1 1/3 minutes for the second mile)
Therefore the average time per mile is half of that, or 2 minutes 40 sec. (2.666 minutes).
That equates to an average speed of 22.5 mph for the 2 mile trip.
(60 minutes / 2.6666 = 22.5)
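The same arithmetic as a quick Python sketch, minute by minute:

```python
# Per-mile times in minutes at each speed.
t1_min = 60 / 15  # 4.0 minutes for the first mile at 15 mph
t2_min = 60 / 45  # ~1.333 minutes (1 min 20 sec) at 45 mph

total_min = t1_min + t2_min       # 5 minutes 20 seconds for 2 miles
avg_min_per_mile = total_min / 2  # 2 minutes 40 seconds per mile
avg_mph = 60 / avg_min_per_mile
print(avg_mph)  # ≈ 22.5 mph, not 30
```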
It’s a trick question. There is no speed at which you can travel the second mile and end up with a 30 mph average over the two-mile course. It’s impossible, since you already took 4 minutes for the first mile.
You must travel the entire two miles in exactly four minutes to average 30mph.
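To see why no finite speed works, here’s a hypothetical sweep (a minimal sketch) over increasingly absurd second-mile speeds, using average = total distance / total time:

```python
# Even extreme second-mile speeds never push the average to 30 mph.
avgs = []
for x in (45, 60, 600, 6000):          # mph for the second mile
    avg = 2 / (1 / 15 + 1 / x)         # overall mph over both miles
    avgs.append(avg)
    print(x, round(avg, 3))
# The average climbs toward 30 mph as x grows, but never reaches it:
# the first mile alone already used the whole 4-minute budget.
```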
Instant acceleration isn’t needed, since 15 mph was stated as the maximum average speed the car could manage for the one mile up the hill.
That means it must have attained a speed faster than 15 mph at some point on the uphill leg.