The end result of all this is that employers will ultimately stop providing health insurance to their employees. That's exactly how it should be. Companies never really "decided for themselves" whether or not to offer health insurance to their employees. They were given enormous tax incentives to do it, and now the game has changed. It's really that simple.
They were not given tax incentives to offer health insurance to employees.
During WWII the government imposed wage controls (good old FDR), so companies could not raise pay to attract new workers. Instead they started offering health insurance, which wasn't frozen by the government. It was government that caused the problem in the first place.