Posted on 07/19/2024 9:15:43 AM PDT by dhs12345
Vanity: Massive IT Outage Hits Major U.S. Hospital Chain: Microsoft Crowdstrike Glitch Forces Cancellation of All Elective Surgeries and Medical Procedures....
Yup. Good point.
But that was minor in comparison. The bug didn’t crash millions of computers.
It was a vulnerability that was easy to patch. People didn’t have to perform CPR on their computers.
This IS the final validation step. Going live is the final validation.
Heck, they might have installed it on a Windows PC. It’s so easy to get false positives in your controlled test environments. Your team puts some “helping” apps on the test machines, and it turns out your software actually relies on them. Or there’s a hardcoded path to something on your network. Heck, even a “proper” clean machine is a trap — then you’re not seeing the potential conflicts from junk like Norton that real machines have.
Especially in the hurry-up-and-ship constant-update model. With razor-thin staffing. Your QA department probably says they need six months to do a proper regression, but you slam out updates every couple of weeks.
It’s not like this is new. I think this is the 3rd or 4th major internet crash this year. The only way it stops is if the business world moves from demanding fast updates to demanding stable ones. Then updates won’t come out more than once a year and real testing can happen. But it also means known problems stick around for months.
And they snoop your computers too.
Wonder if Rob Braxman will have something to say about it.
https://www.youtube.com/channel/UCYVU6rModlGxvJbszCclGGw
“Conspiracy Theory? Crazy?”
Stuff just happens.
Not everything is a conspiracy. It’s hard not to go down that route, but we’ll end up in padded cells if we treat everything as one.
Deployments go wrong. This LEVEL of wrong is inexcusable. This is a miss by multiple entities through multiple gates. A lot of that happening as of late.
Just got my internet back this morning after a tornado came through here on Tuesday. Power was down for 24 hours which wasn’t such a big deal (more worried about the food in my freezer), but I couldn’t keep up with anything on my shitful iPhone that had only 2 bars all week. To come back online, and find this news just made my weekend. It was worth the wait.
Yup, been there done that. You'd have thought that they (and we) would have learned.
BTW, do not install this kind of software on your home PC... Norton, McAfee, etc. Bad, very bad!
Sometimes I wonder if these gadgets are worth it.
Sounds like you need a power backup for your freezer. Although I suspect the food in a freezer will last a few days to a week, as long as you don’t open the door and it’s in a cool place.
Yeah, but worse because it happens at internet speed.
Unfortunately “learning” is only temporary. We cycle through this at my company all the time. The customers want fast fast fast so we’re putting out updates (for 5 supported versions) once a month. Then of course things get unstable so they want stable so we slow down the update cycle. But then they complain that bugs sit around for months and they want fast.
This is killing me, for sure. I am no spring chicken, and I went to bed after 1 AM and got paged again at 4 AM.
I am exhausted. With each year that goes by now, I feel more and more like a Luddite.
“Purportedly, the fix for the PC/CrowdStrike BSOD is to reboot in safe mode and delete a specified CrowdStrike file.”
I haven’t seen anywhere whether that is a true fix or a workaround.
Is it a file that should not be there?
Or is it a file that should be there, but this particular version has a bug that causes the issue?
If this is just a workaround, then maybe CrowdStrike is running with limited or no functionality once you remove that file.
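For what it’s worth, the widely reported workaround is: boot into Safe Mode, go to C:\Windows\System32\drivers\CrowdStrike, and delete the channel file(s) matching C-00000291*.sys. Here’s a minimal sketch of just the delete step — the folder path and file pattern come from public reports of the incident, not from official CrowdStrike guidance, so treat both as assumptions:

```python
import glob
import os

# Channel-file pattern reported as the culprit (public reports, not
# official CrowdStrike documentation).
BAD_PATTERN = "C-00000291*.sys"

def remove_bad_channel_files(driver_dir):
    """Delete channel files matching the reported bad pattern in
    driver_dir and return the list of paths removed."""
    removed = []
    for path in glob.glob(os.path.join(driver_dir, BAD_PATTERN)):
        os.remove(path)
        removed.append(path)
    return removed
```

On an actual affected machine this only helps from Safe Mode, since the driver blue-screens the box before you can log in normally — which is also why it had to be done by hand, machine by machine.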
True. A few years ago, I would have agreed with you.
But the desperation and zealotry of the left these days make me wonder.
A conspiracy theory until it isn’t. Oh and a coincidence.
This might explain why I can’t get into my health insurance provider’s web site today.
Then again, the brute force solution is to boot into safe mode and do a >manual< system restore to a date before the software was installed.
It probably has to be done individually — if your company has 50 computers, it has to be done 50x. If your company has 200 computers, 200x.
Yup.
I think that Microsoft will have to change their control-freak update methods. Pushing out the same update to millions of computers at once is a recipe for disaster. CrowdStrike made the initial mistake but the core of the problem is the mass updates all at once around the world. Big customers are going to demand it.
Our company is too small to afford expensive software like CrowdStrike.
Everything is fine and still working. However, our customers’ systems are down.
Plus I heard that Amazon is affected. Not sure how much.
How about beta testing any updates with just a few customers?
Nah—that makes too much sense.
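Staged (“canary”) rollouts are exactly that idea: the update goes to a small ring of machines first and only promotes to everyone if nothing breaks. A toy sketch of the ring-assignment logic — the function names and ring count are made up for illustration, not any vendor’s actual scheme:

```python
import hashlib

def rollout_ring(machine_id: str, num_rings: int = 4) -> int:
    """Deterministically assign a machine to a rollout ring.
    Ring 0 is the small canary group; higher rings update later.
    Hash-based, so a machine lands in the same ring every release."""
    digest = hashlib.sha256(machine_id.encode()).hexdigest()
    return int(digest, 16) % num_rings

def takes_update(machine_id: str, rollout_stage: int, num_rings: int = 4) -> bool:
    """A machine installs the update only once the rollout has
    advanced to (or past) its ring."""
    return rollout_ring(machine_id, num_rings) <= rollout_stage
```

If the canary ring starts blue-screening, you halt the rollout at stage 0 instead of taking down every hospital on the planet at once.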