Posted on 08/07/2021 12:40:35 PM PDT by BenLurkin
On the surface Apple’s new features sound both sensible and commendable – but they also open a Pandora’s box of privacy and surveillance issues.
Don’t worry, the tech company has assured everyone, the prying is for purely benevolent purposes. On Thursday Apple announced a new set of “protection for children” features that will look through US iPhones for images of child abuse. One of these features is a tool called neuralMatch, which will scan photo libraries to see if they contain anything that matches a database of known child abuse imagery. Another feature, which parents can enable or disable, scans iMessage images sent or received by accounts that belong to a minor. It will then notify the parents when a child receives sexually explicit imagery.
Apple’s attempts to protect children may be valiant, but they also open a Pandora’s box of privacy and surveillance issues. Of particular concern to security researchers and privacy activists is the fact that this new feature doesn’t just look at images stored on the cloud; it scans users’ devices without their consent. Essentially that means there’s now a sort of “backdoor” into an individual’s iPhone, one which has the potential to grow wider and wider. The Electronic Frontier Foundation (EFF), an online civil liberties advocacy group, warns that “all it would take to widen the narrow backdoor that Apple is building is an expansion of the machine learning parameters to look for additional types of content … That’s not a slippery slope; that’s a fully built system just waiting for external pressure to make the slightest change.” You can imagine, for example, how certain countries might pressure Apple to scan for anti-government messages...
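The matching step the article describes can be sketched roughly as follows. This is a hypothetical illustration, not Apple's implementation: the real system reportedly uses a perceptual hash derived from a neural network (so resized or re-encoded copies still match) plus cryptographic protections around the database, whereas the SHA-256 fingerprint and the `KNOWN_HASHES` set below are simple stand-ins.

```python
import hashlib

# Hypothetical sketch only: it illustrates the *matching* step the article
# describes (compare each photo's fingerprint against a database of
# fingerprints of known images). A cryptographic hash like SHA-256, used
# here for simplicity, would miss even a trivially re-encoded copy of an
# image; Apple's system uses a perceptual hash for that reason.

def fingerprint(image_bytes: bytes) -> str:
    """Reduce an image's raw bytes to a fixed-size fingerprint string."""
    return hashlib.sha256(image_bytes).hexdigest()

# Stand-in "known images" database: here it holds only the fingerprint
# of the bytes b"abc", purely for demonstration.
KNOWN_HASHES = {
    "ba7816bf8f01cfea414140de5dae2223b00361a396177a9cb410ff61f20015ad",
}

def matches_known(image_bytes: bytes) -> bool:
    """True if this photo's fingerprint appears in the known-hash database."""
    return fingerprint(image_bytes) in KNOWN_HASHES
```

The point of concern in the article is not the set-membership check itself, which is trivial, but *where* it runs: on the user's device rather than on the server.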
(Excerpt) Read more at theguardian.com ...
“ it scans users’ devices without their consent. ”
They’re covered once you press the accept button during iOS install.
Does pedo joe have an I-phone?
Probably ought to check the Qtard phones first.
They're a pretty sketchy lot.
Put a MAGA hat on a kid and see what happens.
In the fine print does Apple still own the iPhone that customers use? Is everyone just renting them? Call me old fashioned, but I don’t think they should be allowed to search something not owned by them.
“all it would take to widen the narrow backdoor that Apple is building is an expansion of the machine learning parameters to look for additional types of content … That’s not a slippery slope; that’s a fully built system just waiting for external pressure to make the slightest change.”
Governments will exploit this capability for all manner of nefarious purposes. Scan for any image, any person, any set of words...on a billion distributed sensors.
It’s the Chinese Fascist State enabler.
To “save the children”.
Expect checking for images containing guns will be next. Or MAGA hats.
Hunter Biden is unconcerned by this news.
The first person that came to mind!
What they will consider “child abuse” will be above and beyond what most people consider abuse (sexual, physical, emotional).
The truth is that Apple is scanning photos uploaded to its cloud service for signs of child abuse images, not pictures on your phone. It’s their cloud and their right to do this. Why anybody would store anything on their cloud is beyond me, but they are not scanning images on your phone.
Searching for child porn ON their services is fine.
I’ve got a real problem with them scanning on your phone and then notifying the police.
It’s VERY slippery slope type of stuff.
First it’s “we’re just scanning photos on your phone before you upload them to our service, looking for illicit images and notifying the police.”
Then it’s “we’re just scanning photos on your phone for any illicit photos and notifying the police.”
Then it’s “we’re just scanning your phone for illicit files and notifying the police.”
Then it’s “we’re just scanning your computer for illicit files and notifying the police.”
Stopping child abuse is good. But it opens the door for all sorts of big brother operations and 24/7 monitoring of your electronic data and I’m a hard NO on that.
That said - anything actually ON their servers is (and always was) fair game.
But this isn’t about stopping child abuse - it’s about further enacting a totalitarian state.
I think you own the phone. but they own the software.
My thinking is that a smartphone entirely controlled by the communist Chinese will not hand my data directly to the US government, and that my data is useless to the ChiComs for spying purposes. Why pay Apple to spy on me on an iPhone that they do not even make themselves?
Strange plans for strange times on Prison Planet.
The problem is they’re scanning on your phone before upload.
It’s a very simple code change to change that to “we’re just scanning your photos in general for illicit pictures even without uploading.”
They don’t even have to do that voluntarily - government will immediately enact laws because Apple has shown it’s not only possible but feasible.
or diesel pickup trucks.
large homes
it could go on and on.
oddly... you’re right.
Except I wouldn’t do any banking on it.
Good post!
They can still have fun with you, though. Remember when Pokémon was having people walk off of cliffs? Or, the new COVID lockdown “ping” apps will make you hide in a closet.
It’s all fair game in cyber warfare. :)