It works out that compression ratio is the main parameter that determines efficiency. There are gains to be had by making sure ALL the inducted fuel gets burned (more turbulence, longer residence time, etc.), but that only goes so far. To increase the efficiency of any thermodynamic cycle, heat has to be added AT THE HIGHEST TEMPERATURE of the cycle.
Any hydrocarbon fuel holds about 20,000 Btu/lb, whether it's natural gas or crude oil, so that's all we have to work with. A higher compression ratio gives higher combustion temperatures, which benefit efficiency right up to the metallurgical and mechanical limits of the engine.
So a diesel engine, with its 21:1 ratio (it won't even run below about 16:1), will always be more efficient.
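The compression-ratio argument can be made quantitative with the ideal (air-standard) Otto cycle, whose thermal efficiency is eta = 1 - r^(1 - gamma). A minimal sketch (gamma = 1.4 for air and the sample ratios are my illustrative assumptions, and real engines fall well short of these ideal numbers):

```python
GAMMA = 1.4  # ratio of specific heats for air (illustrative assumption)

def otto_efficiency(r: float, gamma: float = GAMMA) -> float:
    """Ideal air-standard Otto cycle thermal efficiency for compression ratio r."""
    return 1.0 - r ** (1.0 - gamma)

# Compare a typical gasoline engine ratio to a diesel-range ratio.
for r in (8, 10, 16, 21):
    print(f"r = {r:2d}:1  ->  ideal efficiency = {otto_efficiency(r):.1%}")
```

The formula shows why the ratio dominates: efficiency climbs monotonically with r, with nothing else in the expression to tune, which is exactly the point being made above.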
But higher pressures also produce more nitrogen oxides, which emissions rules don't allow, leading to:
lower compression ratios, and so more unburned hydrocarbons and carbon monoxide in the exhaust, leading to:
a post-drivetrain afterburner (the catalytic converter), needing:
more expensive gasoline free of lead, sulphur, and phosphorus, leading to:
more crude oil per gallon of gas.
The additional heat discharged into the exhaust manifold means heavier radiators, which in turn mean heavier engines. I'm guessing that complying with the nitrogen oxide regulations costs about 35% in efficiency, not to mention that photo-reactive, partially burned hydrocarbons still increase.