Posted on 08/01/2006 10:41:56 AM PDT by ShadowAce
Oh no!! This is SERIES!!
And HUGH!!
I am stuned!
Check your FReepmail
And most of us here...8^)
Microsoft is only doing this because they feel they have to. I prefer the option of aggressive law enforcement, and arresting those who create exploit code for unpatched vulnerabilities, but right now law enforcement can't bear that load. They will one day, when the right person gets hacked.
I think society in general will have to be a little bit more serious in a proactive sense about computer security. When people understand the risks and dangers of computer crime better, perhaps we'll see a better effort made at actually prosecuting computer crime.
I think one reason you don't see it now is that for many people, even those who use computers regularly, a computer is an unknowable black box. Much the same is true of automobiles today, but we've been around them longer, and society has a more developed sense of law and order surrounding them; in fact, it originally carried over many of the traditions of the horse-and-buggy era that preceded the wide use of automobiles. Most people know to get the oil, tires, and brakes checked on a more or less regular basis, but haven't a clue about much more than that. There are exceptions to this, obviously, or there wouldn't be mechanics or backyard tinkerers, but they are exceptions rather than the general rule.
Much the same thing could be said about people and computers today. The problem is that not enough people have learned the computer equivalent of oil changes and brake/tire checks. In the interconnected world we live in today, this puts everyone at risk. I am affected by the bozo with a cable modem whose computer is p0wned by a hacker ring running out of Russia and used to generate spam that I have to deal with, both on my mailservers and in my inbox. It's the cyber-equivalent of the bozo driving down the street on bald tires and no brakes, who is a direct physical danger to everyone around him. Perhaps a better analogy would be the guy driving down the street belching smoke that practically suffocates you if you are unfortunate enough to be driving behind him.
I suspect that eventually there will be laws and other regulations about computing whereby the user will be held liable, to some degree, for leaving his system wide open to attackers, in much the same way that you can be fined for leaving the keys in your ignition in many American jurisdictions. There is a specific legal term for this that escapes me at the moment (I'm sure someone will remember it for me), but it is similar in a way to the concept of 'enticement'.
I think it is going to take a while for us to catch up to some of the new threats and responsibilities that come from being a networked, computing society.
Check!
I agree, with the exception of our military, who may do such things against foreign adversaries in time of war or in response to hack attempts made against us.
But doing internal security research and then saying 'hey, I found out there is this big error in IE7' should never be illegal. I would not go about it quite that way (I would always give the vendor a heads-up, but if the problem is not addressed I would feel obligated to let the public know).
Finding the holes shouldn't be illegal, but reporting them publicly without first notifying the vendor, or even worse, releasing exploit code before the vendor has time to develop a patch, should be.
Absolutely, we're not there yet. But as the quantity of crimes continues to rise, so will the quantity of prosecutions.
I am affected by the bozo with a cable modem whose computer is p0wned by a hacker ring running out of Russia and used to generate spam that I have to deal with, both on my mailservers and in my inbox. It's the cyber-equivalent of the bozo driving down the street on bald tires and no brakes, who is a direct physical danger to everyone around him.
Excellent analogy.
I suspect that eventually there will be laws and other regulations about computing whereby the user will be held liable, to some degree, for leaving his system wide open to attackers, in much the same way that you can be fined for leaving the keys in your ignition in many American jurisdictions.
Possible, but the charges against someone leaving their keys in their car won't be as severe as the charges against a person who may have stolen the car.
I think it is going to take a while for us to catch up to some of the new threats and responsibilities that come from being a networked, computing society.
It'll turn out like everything else. Want something good, reliable, and safe? You'll have to open your wallet, or be such an expert you can build and maintain it yourself, which is getting harder and harder even with cars.
I would say this should not be done, but not that it should be illegal. Making it illegal sets a precedent where a company can produce an unsafe product, and when the defect is found, the public is not made aware of the problem *or* how to protect themselves from harm.
Imagine if I found a serious defect in a baby car seat model and could not make it public without the OK of the manufacturer. Sure, my butt would be covered, but that would be little comfort to someone who lost a baby in the time it took me to report it to the company and for the company to decide whether it was worth it to recall or to take the risk of being sued.
Why should a computer system be any different? Computers run hospitals, banks, and medical research facilities. We're not just talking about the risk of losing money when a computer defect causes a problem; we could be talking about lives!
Because the fault in the baby seat cannot be used by criminals to purposefully steal from or destroy others. The baby seat requires an arbitrary accident to occur, and doesn't invite others to crash into the car to invoke it, whereas the disclosure of a vulnerability or hack does encourage those who look for such things to use them immediately on unsuspecting innocents.
And the non-disclosure of such a hack keeps people from knowing their system is open to attack when that hack may already be known by thousands of criminals! It's pure stupidity to assume that because one does not tell the general public about a vulnerability, the hacker community won't know about it. Hell, if I were a black-hat hacker I would *not* want the bug disclosed to the general public. The longer only the vendor knows about it, the longer I have a victim set completely unaware their house has no front door!
The argument that when we make something illegal criminals won't have it is the same argument gun grabbers use. But the truth is, when we take away information about system vulnerabilities from people, *only* criminals will have it!
LOL, no, it's obviously more ignorant to claim the hackers will know about it anyway, so you might as well tell them. It's the same argument we hear from those who want us to share all our nuclear secrets with China and Russia: "Why not, they'll get them anyway by stealing; might as well just go ahead and give 'em to them." Wait, you say the same thing about open source software, don't you?
The argument that when we make something illegal criminals won't have it is the same argument gun grabbers use.
Another failed analogy of yours, and you're talking in circles now. Guns are a completely usable finished product, used legally for many authorized purposes. Vulnerabilities aren't guns; they're like a blueprint for making a gun. The only equivalent to guns in this discussion are exploits, but you just said above that there were no good uses of exploits, remember, so why would anyone but criminals have a need for them?
Guns, like information, can be used for both legal and illegal purposes, both attack and defense. Your attempt to ignore the percentage of people who use them illegally in order to break an apt analogy is pretty transparent. An IT administrator can use that information to protect against an attack from someone else who has it; leaving IT administrators without information is like leaving them unarmed.
You want to help the hackers build an arms race on both sides. A better solution is to annihilate the hackers, and you don't do that by feeding them vulnerabilities. You feed them patches, and catch anyone who cracks it.
The problem with that is that you inherently trust the vendor. Several companies have been known to sit on information showing their product had been cracked, and until the information went public, they did nothing about it.
Part of the point of releasing vulnerability information is to force the vendor's hand. I agree that the vendor should be notified first. However, if nothing is done about it, then the information should be released publicly to force their action.
Then we agree. I think 6 months is sufficient time for vendors to respond. After that, only official groups such as CERT should be notified, and anything else should be considered criminally negligent. But keep in mind this puts you at odds with open source leaders like Linus Torvalds, who believe in what they call "full disclosure," meaning let the hackers and everyone else know ASAP.
I (and others here) have always claimed that we don't all follow "OSS leaders'" beliefs. Until now, you have refused to believe that.
I expect you to remember this newfound belief when you start looking for topics to smear us with.