Driverless cars: Google in the front seat, hanky panky in the back
Canada Free Press | 05/12/16 | Dr. Klaus Kaiser
Posted on 05/12/2016 7:44:40 AM PDT by Sean_Anthony
I now understand...
What's so enticing about the (just-around-the-corner) self-driving cars?
In view of my slightly advanced age, I may be forgiven for being a bit slow to learn, but now I understand: it's all about what happens in the back seat!
As the Globe and Mail reports, Kirk, of the Canadian Automated Vehicles Centre of Excellence, told the Canadian Press on Monday that
once computers are doing the driving, there will be a lot more sex in cars. Another (biased?) pundit, Sergio Marchionne, is claimed to have stated that the new system
will be fundamental to delivering automotive technology solutions that ultimately have far-reaching consumer benefits.
TOPICS: Business/Economy; Government; Science; Society
KEYWORDS: driverlesscars; google; sex; technology
To: discostu
You're citing an incident of police using OnStar to do exactly what I predict they will do with self-driving technology. If they are already using this slowing technology (which is not turning off the ignition, btw - that would indeed result in a very, very abrupt stop, while also cutting off power assist to both the steering and brakes), why on Earth would you believe that they would not use an enhanced technology for exactly the same purpose?
Funny how people arguing your side always have to insist the other person is having faith in government even though that has not appeared once in my posts. And I've pointed that out to you before. So stop lying about what I'm saying.
I have no desire to anger or offend you, and I apologize if it came across that way. I'm just engaging in a discussion about a topic that interests me - I'm actually architecting an IoT project now (not self-driving), and the entire IoT space is something I have high hopes for.
But projecting restraint on the part of government when it comes to emerging technology ignores what is already happening with regard to the government calling for back-doors to device security, StingRay devices, etc. Current reality does not bear out the belief that government will not over-reach on control - certainly not on remote-access technology. And you just cited an incident in which law enforcement is using technology to do exactly what I - and surely you, by now - know they will do with self-driving technology. The fact that they haven't done more with OnStar, BMW Driver Assist, etc. is due entirely to the fact that these systems currently don't provide the technical capability to do more. But that will change.
To: AnotherUnixGeek
Not exactly what you predicted. This involved the owner. OnStar will stop a car, they’ll even coordinate it with the cops, but not FOR the cops, they stop them for the owner. They figured out there’s a pretty massive liability to stop the car for any old somebody who claims to be a cop. And no, turning off the ignition does NOT result in an abrupt stop, it results in a coast. My car had a fuel injection problem that would result in it stalling out and then coasting to a stop, yes you lose power steering but not an abrupt stop. They aren’t using the current technology for that purpose, because OnStar isn’t that stupid, same reason they won’t get to with the self driving.
I’m not projecting restraint on the part of the government. I’m pointing out the massive liability the companies will wind up under if they open up any form of remote control, which means the government will never get the chance. I’m sure the government will WANT to overreach control, I’m sure they’ve tried to overreach on OnStar. But the government will never bring enough to bear to outweigh the legal liability. Again, look at the lawsuits happening because on some cars you can hack through the entertainment system and stop a car. That’s the lesson they’re learning from right now, and it will result in no remote control capabilities.
62 posted on 05/14/2016 2:51:07 PM PDT by discostu (Joan Crawford has risen from the grave)
To: discostu
This involved the owner. OnStar will stop a car, they'll even coordinate it with the cops, but not FOR the cops, they stop them for the owner.
Yes, but not without the cooperation of the police, who must confirm to OnStar that traffic conditions are safe for the slowdown and instruct OnStar on when the slowdown should begin (and will probably require a stolen vehicle report). This is already a process driven by law enforcement, though triggered by the owner. It's not much of a step to enable law enforcement to both trigger and carry out the remote procedure, in the interest of safety.
BTW, thanks for pointing out this slowdown feature - I don't have OnStar and am not up to date on the features.
They figured out there's a pretty massive liability to stop the car for any old somebody who claims to be a cop.
Police are already legally empowered to stop cars or re-route them as needed. All they would be given is the technical and legal ability to apply this power to self-driving cars. The liability concern can be addressed by the same law mandating that all car manufacturers provide their new self-driving cars with a secure channel for communication and remote operation by law enforcement with authentication for emergency situations.
it will result in no remote control capabilities.
Self-driving cars will have to be able to accept external signals and respond to them, because road conditions are dynamic and can't be programmed in. They will also have to be able to respond to law enforcement remote control simply because they will have to be able to respond to traffic officer instructions.
Imagine a self-driving car trying to navigate through San Francisco during a temporary event like Super Bowl week, when streets being used for Super Bowl festivities were blocked off by patrol officers waving cars onto alternate routes. No, the car wouldn't hit anyone or run into anything, but without remote over-ride from the officers, it would have no way to know it should not proceed directly into the middle of a temporarily blocked-off street. Nor could you rely on manual over-ride - self-driving car technology currently requires a human operator with a licence, but eventually it will not.
To: AnotherUnixGeek
No actually the owner alone can call OnStar and make the car stop. The only reason they coordinated with the police was so that the car would stop in a convenient place to arrest the carjacker. It’s a process driven by the owner, with occasional help from the police as applicable. It is a GIANT step to enable law enforcement to trigger it on their own, you can tell it’s a giant step because they haven’t taken it in the last twenty years.
Comparing your paranoid predictions to traffic routing is kind of pathetic. The liability concern can’t be addressed by any law. They’re gonna get hacked, and they’re gonna get sued, and that’s why they’re lobbying like hell to make sure they don’t have to give that power. There’s a reason the folks making the big lobby push are Google, Volvo and Ford, not the fraternity of police.
No they won’t. They can MONITOR road conditions through GPS the same way they currently do and route adjust, but they will not accept external commands. Traffic officer instructions are already handled through visual signals.
Can your GPS handle Superbowl week? Then there you go. Remote override would be a terrible way to handle self driving cars in traffic control, too many cars, not enough cops, you can’t mass control the cars because they aren’t all in the same spot, and individual control would take forever. They’ll need to be able to read the waving flashlights just like people. Or in the first wave of design they’ll need to be able to say “too complicated, handle it human”. Really these situations you bring up actually prove why you’re wrong. Remote control is easily the worst solution.
64 posted on 05/14/2016 4:26:06 PM PDT by discostu (Joan Crawford has risen from the grave)
To: discostu
No actually the owner alone can call OnStar and make the car stop. The only reason they coordinated with the police was so that the car would stop in a convenient place to arrest the carjacker.
According to an article by OnStar's public policy manager, you're wrong.
"Once law enforcement officers have established a clear line of sight with the stolen vehicle, an OnStar adviser will remotely flash the stolen vehicle's lights to verify for authorities that they have the correct vehicle in their view. After law enforcement officials have determined that conditions are safe and that police backup is on hand if needed, they can request a slowdown."
"They" in that last sentence clearly being the police. Not the owner.
Comparing your paranoid predictions to traffic routing is kind of pathetic. The liability concern can't be addressed by any law.
You seem quite agitated and insist on resorting to insulting language, despite my strenuous efforts to keep this civil. Fine, I'll reply in kind - your second sentence is idiotic. Legal liability is defined by the law.
They're gonna get hacked
Undoubtedly.
and they're gonna get sued
Maybe. Anyone can file a lawsuit, and any frivolous lawsuit can be dismissed if there is no liability under the law. Any law which mandates that auto manufacturers implement such systems will provide some immunity against liability for hacking, since even legislators are dimly aware that there is no such thing as perfect network security.
There's a reason the folks making the big lobby push are Google, Volvo and Ford, not the fraternity of police.
And that reason is that the technology is far too new and primitive for law enforcement or government to start considering the possibilities. That will change - the future is not determined by today's OnStar.
No they won't. They can MONITOR road conditions through GPS the same way they currently do and route adjust, but they will not accept external commands. Traffic officer instructions are already handled through visual signals.
It would help if you quoted what exactly you're replying to. How are "traffic officer instructions already handled through visual signals" by self-driving cars? What are you talking about?
Can your GPS handle Superbowl week? Then there you go.
Nope, my GPS can't handle Super Bowl week. It's up to me as the human operator to handle that. But since true self-driving technology will, by definition, not be dependent on a human operator, that could very well not be a possibility for self-driving cars.
Remote override would be a terrible way to handle self driving cars in traffic control, too many cars, not enough cops
Without remote control, how would you handle self-driving cars in emergency or temporary traffic conditions? Would you always require human operators with driver's licenses to be in the car to take over? That's not what many people considering this issue envision.
you cant mass control the cars because they arent all in the same spot
Yes, you can. Provide network access to the car, let it receive traffic alert information and adjust its route in real time.
and individual control would take forever.
If the car reaches the site of the traffic problem, yes it would take forever. A lot of that can be alleviated by remote alerts of traffic, allowing the car to adjust its route well ahead of time, as I've already said.
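The "adjust its route well ahead of time" idea is ordinary shortest-path re-planning: when a live alert closes a road segment, the planner simply recomputes. A minimal sketch with an invented road graph and alert (no real navigation API is implied):

```python
# Hypothetical sketch: a car re-plans its route when a live traffic
# alert marks a road segment as closed, before it ever reaches it.
# The graph and segment names are invented for illustration.
import heapq

def shortest_route(graph, start, goal):
    """Plain Dijkstra over {node: {neighbor: cost}}; returns a node list."""
    queue = [(0, start, [start])]
    seen = set()
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == goal:
            return path
        if node in seen:
            continue
        seen.add(node)
        for nbr, c in graph.get(node, {}).items():
            if nbr not in seen:
                heapq.heappush(queue, (cost + c, nbr, path + [nbr]))
    return None

roads = {
    "A": {"B": 1, "C": 4},
    "B": {"D": 1},
    "C": {"D": 1},
    "D": {},
}

print(shortest_route(roads, "A", "D"))   # normally goes via B

# Live alert arrives: segment B->D is blocked off for an event.
del roads["B"]["D"]
print(shortest_route(roads, "A", "D"))   # re-routed via C, well in advance
```

This is exactly what a phone navigation app does today when it receives a closure update; no remote control of the vehicle is involved, only data the on-board planner consumes.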
They'll need to be able to read the waving flashlights just like people.
LOL - do you really believe that this would be more practical or realistic than simply broadcasting re-routing instructions over a secure police channel from a transmitter held and programmed by the officer? What would prevent anyone with a flashlight from re-routing your car?
Or in the first wave of design they'll need to be able to say "too complicated, handle it human."
Yes, we already know that, since we're currently in the first wave of design and they do indeed say "too complicated, handle it human". But I'm interested in where this technology is headed, not in what its current limitations are. The future is not defined by whatever OnStar is currently capable of.
To: AnotherUnixGeek
How does anybody know a vehicle is stolen? Reported by the owner. So no that’s still a process started by the owner.
They ARE paranoid predictions, that’s not insulting.
No they WILL get sued. You admit they’ll get hacked, that WILL result in lawsuits, because it will result in loss of life and property. And they won’t be frivolous lawsuits because the hacking will have resulted in loss of life and property. There’s really nothing frivolous about suing a car company that lets random scriptkiddies take over your car.
No, that reason is the car companies have a much more lucrative vested interest in structuring the laws. And part of that structure they want is to NOT open the door to remote control. The past is shown with today’s OnStar and since that past is TWO DECADES LONG and all these same paranoid predictions were made about OnStar and NONE have come to pass it is certainly a solid indication of the likely course of the future.
No actually your GPS CAN handle Superbowl week, at least if you bother to keep it up to date. That’s part of the point of updates, getting road closures, reroutes and construction zones so you can route accordingly.
I would handle them the same way we handle person driven cars: cops out there directing traffic. Self driving cars will HAVE TO be able to handle that for them to truly be useful as a primary usage mode.
Route adjustment is GPS, no need for remote control on that, just live data updates which the car will respond to. Same as if you get a traffic alert on your phone’s navigation app.
It’s not only more practical, it’s absolutely necessary. Not every traffic situation has enough infrastructure around it to be taking over cars. On the other hand there WILL be cops there directing human driven traffic. So since the cars absolutely positively CANNOT go into mass production without being able to respond to human traffic direction that makes it very practical. Your secure police channel transmitter presupposes enough spare cops to be hanging around taking over the cars, which you might or might not have depending on the size of the police force and the problem.
Where the technology is headed is to the cars being able to take all the same inputs as humans, process them faster, and never need to be remotely controlled. There’s too much overhead to remote control no matter how you design the system, and all that overhead would have to be in ADDITION to everything we do to provide people with input. Much better, smarter, more secure, and cheaper for everybody if the cars can just handle that input directly.
66 posted on 05/15/2016 9:28:00 AM PDT by discostu (Joan Crawford has risen from the grave)
To: discostu
Where the technology is headed is to the cars being able to take all the same inputs as humans, process them faster
Yes, under normal operation.
and never need to be remotely controlled.
Never need to? Agree. Will never be? Disagree. There will be remote over-ride of self-driving cars to bring them to a halt, to slow them down as needed, to re-route them or deny access to routes as needed, where "as needed" is determined by governmental traffic and environmental policy, and by law enforcement. We've already gone over the fact that the law enforcement part of this is already in its first stages.
Viewing this as paranoia requires a powerful faith in the self-restraint of government and in the power of a tort system whose laws are defined by the same government. Neither faith is supported by reality. Believing otherwise wanders into the realm of faith, and I can't argue with faith. We're not going to agree about this, we each have our own prediction, we each think the other is wrong, and there is no way to settle it definitively except by waiting to observe what happens in the future. We will see.
To: AnotherUnixGeek
Yeah never. If the car can respond to cops waving arms, bubble lights and all that there’s simply no need to ever remote control the car. All that stuff you list as will be will ACTUALLY be 100% unnecessary to do via remote. It will NEVER be needed. And, AGAIN, including such a possibility would be the STUPIDEST possible thing a car maker could do, because once it can be remote controlled it WILL be hacked, people WILL die because of the hacking, they WILL be sued, and they WILL lose. What you insist must happen in actuality CANNOT, because such a thing is actually the WORST design, which of course is why NONE of the companies working on self driving cars are working on that design. They already know it’s a terrible idea.
We’ve already seen. Twenty years of OnStar HAS proven you wrong. Whether you’re willing to admit it or not. The data is in, your prediction is wrong. The verdict is in. It has been settled definitely. Sorry you can’t see that.
68 posted on 05/15/2016 12:54:53 PM PDT by discostu (Joan Crawford has risen from the grave)
To: dfwgator
What could possibly go wrong? Nothing, according to the people who think "driverless" cars are the solution to our transportation problems.
Something like this could never happen. No, wait...
69 posted on 05/15/2016 12:58:21 PM PDT by Fresh Wind (Hey now baby, get into my big black car, I just want to show you what my politics are.)
To: discostu
If the car can respond to cops waving arms, bubble lights and all that there's simply no need to ever remote control the car.
Such recognition and response is far more difficult to implement than simple remote command communication with the car - in fact, what you're describing is remote communication with the car, relying on visual input and interpretation. Far more error prone, far more prone to visual hacks by people trying to play games with the car. I'm sorry if you can't see that.
Twenty years of OnStar HAS proven you wrong.
All OnStar has proven is that the first stages of what I predict have already been implemented - the ability of law enforcement (not the owner) to remotely slow down a car on command. I have no idea why you think illustrating my prediction helps your view at all.
To: AnotherUnixGeek
It’ll be hard, but it’s absolutely necessary. You cannot have a self driving car on the road that doesn’t pull to the right when an emergency vehicle comes behind it. And you absolutely cannot have a self driving car on the road that can’t be directed by hand in traffic. Both those situations are too spontaneous to be consistently handled by any remote control method. And they are both non-starters, vehicles on the road that can’t handle those situations are too dangerous to the world. And there’s a MASSIVE difference between remote communication and remote control. It might be more prone to error than remote control, but it is absolutely necessary.
No actually OnStar proves you 100% wrong. OnStar only turns off cars when THE OWNER says so. Even your “example”, the owner reported the car stolen, and allowed OnStar to do what was necessary to recover it. OnStar shows that law enforcement will NOT be allowed to just take over cars, period. I didn’t illustrate your prediction, your own example showed I’m right.
71 posted on 05/15/2016 1:53:46 PM PDT by discostu (Joan Crawford has risen from the grave)
To: discostu
You cannot have a self driving car on the road that doesn't pull to the right when an emergency vehicle comes behind it. And you absolutely cannot have a self driving car on the road that can't be directed by hand in traffic. Both those situations are too spontaneous to be consistently handled by any remote control method.
Think about it - would you rather depend on a self-driving vehicle to get the same visual signals that a human being gets, over the unsecured EM band visible to the human eye and then perform the very difficult interpretive calculations needed to understand lights or human gestures the way a human being could? Or would you rather depend on encrypted commands transmitted over a secure channel by law enforcement or an emergency vehicle? And this doesn't even take into consideration that the latter would be far easier to implement, and far more secure. The visual EM spectrum that we use for vision is not secured in any way. Anyone can visually signal your car - this is where your concerns about hacking should apply.
What you propose may eventually be possible - AI sophisticated enough to recognize all of the visual cues humans use will be necessary for robots to interact with a world built for humans - but it will not happen anytime soon and self-driving technology will not wait for it.
No actually OnStar proves you 100% wrong. OnStar only turns off cars when THE OWNER says so.
As the article written by an OnStar exec clearly states, law enforcement will determine if and when the car will be slowed, and they will make the call to execute the procedure. I'm sorry, but it's clearly written and you're wrong. The owner can report a vehicle stolen to initiate the process, but that's as far as the owner's involvement in the slowdown goes.
The remote slowdown feature is a recent OnStar feature, having been introduced in 2009. As more cars start incorporating remote communication to control car operations, at some point government and law enforcement will start calling for control of remote actions of cars. If a car is being used in the commission of a crime and it can be stopped remotely, do you really think law enforcement will not want the ability to do so or do you think they will want to wait to get the permission of the owner, who may well be the person committing the crime?
To: AnotherUnixGeek
Self driving vehicle completely using its cameras and sensors 100%, because, even though you erroneously put the word “unsecured” in there that CAN’T be hacked. And the remote controlled system WILL be hacked. Also it will continue to work when the cops are spread too thin, when the cops are having blue flu, or when the government has remembered it doesn’t actually like us. You can’t hack the visual spectrum.
It’s an eventuality we’ll have AI sophisticated enough to handle all that. Heck we’ve got the CPU power right now in your smartphone. Your smartphone has between 10 and 40 times as much raw computer power as Deep Blue. When it comes to brute force computing we are literally in the world of just throw power at it. Only hard part now is working the code.
As the article clearly states the car WAS REPORTED STOLEN BY THE OWNER. Thus getting the entire ball rolling. Initiating the process is what matters. The owner said “yes do what it takes to get my car back” as long as OnStar is not stopping cars without the permission of the owner my point stands unassailable. Hell it stands PROVEN.
73 posted on 05/15/2016 4:38:09 PM PDT by discostu (Joan Crawford has risen from the grave)
To: discostu
You can't hack the visual spectrum.
Since the visual spectrum is completely open and unsecured, there is zero need to hack it. Any camera capable of viewing human actions or emergency vehicle lights will see whatever is put in front of it, including any random stranger waving his hands. By contrast, secure network communications actually are pretty secure and millions of confidential transactions are kept safe by this means every single day - that's far better than zero security, which is what you propose, and which is completely at odds with your security concerns.
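To make the contrast concrete, here is a minimal sketch of what "authenticated commands over a secure channel" could look like: the car honors a traffic command only if its message authentication code verifies under a key shared with the issuing authority. The key handling and message format here are invented for illustration, not any real automotive protocol.

```python
# Hypothetical sketch of an authenticated command channel. A random
# stranger cannot forge a command without the shared key, whereas
# anyone at all can stand in the road and wave their arms.
import hmac
import hashlib

SHARED_KEY = b"example-key-provisioned-at-manufacture"  # assumption for illustration

def sign(command: bytes) -> bytes:
    """Authority side: attach an HMAC-SHA256 tag to a command."""
    return hmac.new(SHARED_KEY, command, hashlib.sha256).digest()

def accept(command: bytes, tag: bytes) -> bool:
    """Car side: constant-time check; forged or tampered commands are rejected."""
    return hmac.compare_digest(sign(command), tag)

cmd = b"DETOUR:route-7"
tag = sign(cmd)
print(accept(cmd, tag))          # True: authentic command is honored
print(accept(b"STOP:now", tag))  # False: tag doesn't match, command ignored
```

A real deployment would need key distribution, revocation, and replay protection on top of this, but even the sketch shows the asymmetry: the signed channel fails closed, while the visual channel accepts whatever is put in front of the camera.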
The only way such a system would work is if the computer in the car had the ability to perform the same filtering of visual data and apply the same decision making processes an adult human brain does. Nope, not going to happen any time soon. AI has made some progress in the last few decades, but replicating significant amounts of human brain functionality? Science fiction right now, and irrelevant to self-driving technology for the next few decades.
As the article clearly states the car WAS REPORTED STOLEN BY THE OWNER. Thus getting the entire ball rolling. Initiating the process is what matters.
The folks at OnStar say otherwise. The police will not use the slowing technology simply because someone reports a stolen car with a recent OnStar installation. That is entirely under the control and discretion of law enforcement, working through OnStar. One more time: OnStar states quite clearly that they will only apply the slowing procedure at the request of law enforcement. That's all there is to it.
This is a relatively new technology and doesn't exist on the overwhelming majority of cars - as those two factors change, police and government will inevitably seek to gain over-ride control of the technology to increase safety, improve the environment, ease traffic conditions, etc. The result will be an inevitable decrease in the personal mobility of car owners. Believing otherwise is a matter of unwarranted faith.
To: AnotherUnixGeek
Random stranger waving his hands isn’t an issue. Not if it’s programmed right. Basically for someone to “hack” a self driving car through visual input they’d have to imitate a cop well enough to fool a person, in which case you were hosed anyway. But there’s a lot of effort into that, far superior to any old random script kiddie hitting the right website.
The car will HAVE to commit a lot of filtering. That’s the real challenge of this whole thing, the fact is we take in a LOT of data while we drive and ignore almost all of it. You don’t realize how much of what you see and hear every second of driving you ignore, but sitting at a stoplight (because we don’t want to cause any accidents here) just look around and notice how much you can see and how little of what you see matters to the task of driving. Software wise this is a big data problem, the cars are going to be taking in half a GB a second of data, ignoring 90%, and responding to the rest. But it’s necessary. If they can’t spot the difference between some random douche waving his hands and a cop directing traffic they can’t drive, there’s a million other more frequent tasks they’ll fail at if they can’t filter that data. Just think about parking lot stop signs, how often do you see them a little out of skew, so you can actually see them from the road and could conceivably think they apply. Self driving cars will have to filter that too. It’s just a necessary part of the tech.
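The filtering step described above can be sketched in a toy form: a single frame produces many detections, and the driving logic keeps only the relevant, confident ones before any decision making happens. The object categories, confidence scores, and threshold are all invented for illustration.

```python
# Toy sketch of early-stage sensor filtering: most of what a frame
# contains is irrelevant to the driving task and is discarded up front.
# Categories and the relevance rule are invented for illustration.
RELEVANT = {"vehicle", "pedestrian", "traffic_signal", "officer_gesture"}

def filter_frame(detections):
    """Keep only detections that matter to driving and are confident enough."""
    return [d for d in detections
            if d["kind"] in RELEVANT and d["confidence"] >= 0.5]

frame = [
    {"kind": "billboard",      "confidence": 0.99},  # irrelevant category
    {"kind": "vehicle",        "confidence": 0.91},
    {"kind": "traffic_signal", "confidence": 0.30},  # skewed lot sign: too uncertain
    {"kind": "pedestrian",     "confidence": 0.88},
]

kept = filter_frame(frame)
print([d["kind"] for d in kept])   # ['vehicle', 'pedestrian']
```

The real problem is of course in producing those detections and confidences from raw pixels, but the discard-almost-everything shape of the pipeline is as sketched.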
How do the police know the car was stolen? Sorry, logistically there’s no other way for the situation to have happened than to have been owner initiated.
75 posted on 05/16/2016 7:58:57 AM PDT by discostu (Joan Crawford has risen from the grave)
To: discostu
You don't realize how much of what you see and hear every second of driving you ignore, but sitting at a stoplight (because we don't want to cause any accidents here) just look around and notice how much you can see and how little of what you see matters to the task of driving.
This is true not only for driving, but for everything a sighted person does. Our brains constantly filter massive amounts of visual input and select for the most important shapes, movements and colors to consider. Then our brains must consciously apply a number of decision criteria to judge the authenticity of the input and choose whether or not to act in response to the input. Coming close to such flexibility and power in computing will happen with time, but not anytime soon. AI and even computing power simply aren't advancing that quickly.
It's far easier, far quicker to implement and much more secure in the near future to simply have a traffic officer transmit detouring or emergency commands on a secured traffic-cop channel which your car will understand and respond to. There is no need for filtering, since all input on the channel is relevant, the channel is secure so there is no need for the on-board computer to have the ability to decide whether or not the input should be followed. I agree that a car which has a powerful intelligence capable of making such decisions for itself would be ideal, but it's not realizable within the next 20 years - given the pace of AI development over the last 30 years, I doubt it will happen in the next 50. And self-driving isn't going to wait that long.
To: AnotherUnixGeek
I think it’ll be pretty soon. Computing power right now is basically free. As I pointed out, your basic smartphone has at least 10 times as much raw computing power as Deep Blue (the one that beat Kasparov), and some are closing in on 40 times as much. Really at this point we have CPU power to spare, it’s just figuring out what to do with it. And as the big data world takes leaps and bounds we’re figuring out what to do with it. We’re in a remarkable part of the computer age.
It’s not easier, quicker to implement, or more secure to have a traffic officer transmit detouring commands. That model pre-supposes you have spare cops hanging around, they can figure out which cars are smart cars, which cars they’re controlling, steer them safely while not being in the car. And as for security, forget about it, you’ve already admitted said system is guaranteed to be hacked. That model is a non-starter, it won’t work. Just imagine trying to direct traffic out of that Super Bowl you referred to, there’s 50,000 cars coming out of the stadium, a random percentage of them are self driving, they are scattered randomly through out the crowd. You’re already using up 100 cops to direct the human driven tactic. How many more cops are necessary for your system? How many tablets do they need to pull this off? How much training do they need to be able to safely drive these cars that they might not even be sure which one it is? And then what happens to such a system in a blackout where you just don’t have that many extra cops? Or 4 hours into the blackout when the celltowers have run out of juice? Or out in the middle of nowhere where Scalia died? It just plain will not work. It’s a bad system that can only function in ideal conditions, which of course if you have cops directing traffic you’ve already left the ideal condition world behind. Self driving cars able to read the situation the same way people do (only faster, not tired, and not drunk) is the ONLY method that can actually work.
It’ll happen in the next 20 years. It’ll happen in the next 5. There WILL be a self driving car in the main market in 2020, and it will NOT be remote controlled. It’s just a decision matrix tree. And with our smartphones having 10 times as much computing power as it takes to beat a grandmaster, decision matrix trees ain’t that tough.
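The "decision matrix tree" idea amounts to a chain of cheap branch tests mapping a perceived situation to an action. A minimal sketch, with invented conditions and actions:

```python
# Hypothetical decision tree for a self-driving response layer.
# The situation fields and action names are invented for illustration;
# the point is only that, once perception has done its job, the
# decision itself is a fast, ordered set of branch tests.
def decide(situation):
    if situation.get("emergency_vehicle_behind"):
        return "pull_right_and_stop"
    if situation.get("officer_directing"):
        return "follow_officer_gesture"   # officer outranks the signal
    if situation.get("light") == "red":
        return "stop_at_line"
    return "proceed"

print(decide({"emergency_vehicle_behind": True}))           # pull_right_and_stop
print(decide({"officer_directing": True, "light": "red"}))  # follow_officer_gesture
print(decide({"light": "green"}))                           # proceed
```

The hard computational work lives in populating `situation` from sensor data, not in traversing the tree; the tree itself is trivial on any modern processor.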
77 posted on 05/16/2016 1:04:01 PM PDT by discostu (Joan Crawford has risen from the grave)
To: discostu
It'll happen in the next 20 years. It'll happen in the next 5. There WILL be a self driving car in the main market in 2020, and it will NOT be remote controlled. It's just a decision matrix tree. And with our smartphones having 10 times as much computing power as it takes to beat a grandmaster, decision matrix trees ain't that tough.
Well, I assume we'll both be around in 5 years to check on our predictions. True self-driving cars should become common-place in about 10 years and while the cars will have a large amount of on-board computing power, they will also have remote over-ride access, for use by law enforcement initially and later by governmental authorities. The beginnings are already there, as we've discussed re: OnStar.
The sophistication and power of computing ability and AI that your projection requires have implications for machine intelligence that go way beyond self-driving, and would put the event horizon of a technological singularity closer to now than I find credible from today's AI. The task of visualizing and recognizing a police officer and his gestures among a crowd of other people and other cars on a busy street and deciding whether or not his gestures should be obeyed is vastly more complex than doing a search on a limited number of nodes for each move in a chess decision tree. Our machines won't be there for a while.
To: AnotherUnixGeek
I don’t think it’s as big a hill to climb. It’s really just objects. Look at some of the stuff happening in the world of facial recognition, big data processing has all the keys. And there won’t be remote override. Too risky. The first generation will be in luxury cars, another reason to avoid remote override, people plunking down $80G for a car aren’t going to want the security hole and will be able to afford the lawyers to prove it. In the end it really is just a limited number of nodes for each move, the hardest part of the data processing is weeding out the useless data.
79 posted on 05/16/2016 2:32:55 PM PDT by discostu (Joan Crawford has risen from the grave)
To: discostu
I don't think this conversation can go much further, but I thank you for the discussion. We will have a chance to see which of us is right, and I'm looking forward to seeing how self-driving and IoT develop in the next decade.
BTW, I saw a couple of Google's self-driving cars tooling around Los Altos on San Antonio Road yesterday. Very cool.