It's designed to launch a spacecraft-carrying rocket from high altitude, thereby saving considerable rocket fuel and money on each launch.
I'm not doubting that it's true, but I don't understand it, and would like to.
It seems like you need X amount of energy to get Y amount of weight into orbit. My layman brain isn't getting how flying partway up alters that relationship.
Again, I don't doubt it's true, but I would like to read more details about HOW it's true, and how much is really saved.
Due to inertia and having to fight its way out of the gravity well, a rocket spends far more fuel and energy going from sea level to 35,000 feet than it does going from 35,000 feet to 100,000 and beyond. Among other reasons: the rocket is at its heaviest at liftoff, not because gravity is meaningfully stronger at sea level but because its tanks are full, so it burns fuel just to lift the fuel it will burn later. It also has to push through the densest air at sea level, which loses energy to drag and makes the engine nozzles less efficient than they are in near-vacuum. And it does all of this climbing more or less straight up, fighting gravity directly the whole time.
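You can put rough numbers on that "lifting the fuel you'll burn later" part with the Tsiolkovsky rocket equation. Here's a back-of-the-envelope sketch in Python; every figure in it (the exhaust velocity, the delta-v budgets, the ~800 m/s an air launch might save) is an assumption I've plugged in for illustration, not data from any real vehicle:

```python
from math import exp

def propellant_needed(delta_v, exhaust_velocity, final_mass):
    """Tsiolkovsky rocket equation solved for propellant mass:
    delta_v = ve * ln(m0 / mf)  =>  m0 = mf * exp(delta_v / ve)."""
    initial_mass = final_mass * exp(delta_v / exhaust_velocity)
    return initial_mass - final_mass

VE = 3500.0            # assumed effective exhaust velocity, m/s (ballpark kerolox)
FINAL_MASS = 10_000.0  # assumed payload + dry structure, kg

ground = propellant_needed(9_400, VE, FINAL_MASS)  # ~9.4 km/s is a typical ground-to-LEO budget
air = propellant_needed(8_600, VE, FINAL_MASS)     # assume air launch shaves ~800 m/s off that

print(f"from the ground: {ground:,.0f} kg of propellant")
print(f"air-launched:    {air:,.0f} kg of propellant")
print(f"propellant saved: {1 - air / ground:.0%}")
```

The point is the exponential: shaving less than 10% off the required delta-v cuts the propellant load by roughly 20%, because every kilogram of fuel you no longer need is a kilogram you also no longer need more fuel to lift.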
If you lift the rocket to 35,000 feet first, you can generally toss the boosters and much of the first stage entirely; you need less of everything to get to orbit from there. This is why experimental spaceplanes like the X-15 were launched from high-altitude carrier aircraft (a B-52, in that case) instead of being launched on rockets from the ground. The carrier plane handles the least efficient part of the trip cheaply: it doesn't fight gravity directly by climbing vertically, because its wings generate the lift while it flies mostly perpendicular to gravity, and its air-breathing jet engines don't have to carry their own oxidizer the way a rocket does.
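And if it helps to see where an ~800 m/s savings (as assumed in the sketch above) could come from, here's the same argument as a rough delta-v budget. Again, these are illustrative round numbers, not any real vehicle's figures:

```python
# Rough, illustrative delta-v budget; every number here is a ballpark assumption.
ORBITAL_SPEED = 7_800  # m/s, speed needed to stay in low Earth orbit
GRAVITY_LOSS  = 1_400  # m/s, ground launch: thrust spent climbing vertically
DRAG_LOSS     = 200    # m/s, ground launch: shoving through dense low-altitude air

ground_budget = ORBITAL_SPEED + GRAVITY_LOSS + DRAG_LOSS

# Air launch at ~35,000 ft: the carrier contributes some forward speed, the thin
# air nearly eliminates drag loss, and the rocket can pitch toward horizontal
# sooner, trimming the gravity loss.
CARRIER_SPEED     = 250  # m/s, subsonic carrier aircraft
REDUCED_GRAV_LOSS = 1_000
REDUCED_DRAG_LOSS = 50

air_budget = ORBITAL_SPEED - CARRIER_SPEED + REDUCED_GRAV_LOSS + REDUCED_DRAG_LOSS

print(f"ground launch: ~{ground_budget:,} m/s")
print(f"air launch:    ~{air_budget:,} m/s  (about {ground_budget - air_budget} m/s less)")
```

Notice that most of the win comes from the reduced gravity and drag losses; the carrier's own forward speed is a comparatively small contribution.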