"Considering that most of the people who are supposed intellectual superiors (libs) make some of the most catastrophic decisions in the history of humanity, I'm not sure this singularity is a good idea.
But I'm just a neanderthal conservative.
Maybe instead I should be the first to welcome our singularity overlords..."
I agree with you. Unfortunately, I can see no way to stop this type of thing from happening. Consider this: pretty soon, military systems will start becoming too fast-acting for people to control. It's a little like programmed trading, if you're familiar with that. The computers play games with the stock market that no person can keep up with.
Programmed trading is constrained by law. But consider the military case. Suppose we suspected that such a system, set up to defend us, was going awry and working against our self-interest. If we pulled the plug, we would be vulnerable to our enemies. (This scenario presumes we would have enemies on a similar technological level.)
Anyway, if you think about it, we will be developing all sorts of dependencies on computers that we can't reverse or escape without tremendous cost. Just think of the biggest microprocessors. I'm not up to date on the numbers, but the last I heard they were up to a third of a billion transistors in a single IC chip. Obviously, no one person can actually understand such a design. Makes you think.
Maybe SI should be quarantined in space, or a separate human colony established in space beforehand that can defend itself and destroy a rogue SI, should things go terribly wrong.