How Chicom of them.
No way that Big Tech Apple would spy on their customers.
Their money is made on their products. Right?
These Apple “executives” need to be arrested for sedition and treason. Put that in your little data base, you pu$$ified geeks. We see you.
There are days I think not having a cell-phone is one of the smartest decisions I’ve ever made.
Good thing I have zero products from them.
Tuesday September 18, 2018 1:48 PM PDT by Juli Clover
With the release of iOS 12, tvOS 12, and watchOS 5 yesterday, Apple made some quiet changes to its iTunes and App Store privacy policy on iOS devices and the Apple TV.
Newly updated language in the iTunes and App Store privacy policy states that Apple is using a new device trust score to help identify and cut down on fraud.
Apple says that information about how you use your device, including the approximate number of phone calls or emails you receive, is used to compute the device trust score when you make a purchase.
To help identify and prevent fraud, information about how you use your device, including the approximate number of phone calls or emails you send and receive, will be used to compute a device trust score when you attempt a purchase. The submissions are designed so Apple cannot learn the real values on your device. The scores are stored for a fixed time on our servers.
This warning shows up on both iOS devices and the Apple TV, which can't send emails or make phone calls, so it appears to be blanket wording Apple is using for all of its iTunes and App Store privacy updates.
Apple has always been committed to protecting users from fraud, and the trust score is a new anti-fraud technique introduced in iOS 12. Like many of Apple's data collection practices, the trust score has been designed with user privacy in mind.
Data used to calculate the trust score is on-device and related to usage patterns rather than the content of communications (Apple won't know who you called or emailed or what you talked about), and when sent to Apple, the trust score is encrypted and stored for a short period of time.
Apple does not receive information beyond the score itself because the data used to determine the trust score is stored on device, as previously mentioned. A single trust score number Apple uses actually contains data from thousands of accounts, which protects your individual data and prevents Apple from seeing a single user's device usage patterns.
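Apple has not published the mechanism behind this, but as a rough illustration of what "designed so Apple cannot learn the real values" could look like in practice, here is a minimal, purely hypothetical Swift sketch: only coarse, noise-perturbed counts are submitted, never exact ones. Every type name, function, and parameter below is an assumption for illustration, not Apple's actual implementation.

```swift
import Foundation

// Hypothetical sketch: report usage counts in a way that hides the exact
// on-device values. This is NOT Apple's algorithm, which is undisclosed.

struct UsageSummary {
    let approximateCallCount: Int   // counts only; no contacts, no content
    let approximateEmailCount: Int
}

/// Round a count into a coarse bucket, then add bounded random noise,
/// so the reported number never reveals the true value.
func obfuscatedCount(_ trueCount: Int, bucketSize: Int = 10, noiseRange: Int = 5) -> Int {
    let bucketed = (trueCount / bucketSize) * bucketSize
    let noise = Int.random(in: (-noiseRange)...noiseRange)
    return max(0, bucketed + noise)
}

/// Build the payload that would leave the device in this sketch:
/// only blurred aggregates, no identities, no message content.
func makeSubmission(from summary: UsageSummary) -> [String: Int] {
    return [
        "calls": obfuscatedCount(summary.approximateCallCount),
        "emails": obfuscatedCount(summary.approximateEmailCount)
    ]
}
```

The point of the sketch is the design property the policy describes: only blurred aggregates ever leave the device, so no single submission reveals a user's exact call or email counts.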
Apple says the new iTunes and App Store trust scores are used solely to identify and prevent fraud and have no other purpose.
https://www.macrumors.com/2018/09/18/apple-device-trust-score-itunes-app-store/
I would put my trust in Apple at -5.
Are you cool with this crap?
Hey — your guys meet expectations!!!
Love love love!!!! :)
I have a trust score for that caddilac commie sodomite Tim Cook:
LESS THAN ZERO.
(Excerpt) Read more at planetfreewill.com ...
***********
Why?
Admit it. You didn’t write anything over there. You copied and pasted everything there from two other sources, then teased a paragraph over here as “clickbait” to drive traffic to your blog.
Nice try, newb.
If you want to share it, we can share it here.
*************
Apple Is Now Giving People Trust Scores Based on Their Calls and Emails
September 20, 2018
By: Mac Slavo | SHTFplan
This information will likely not be a surprise to anyone who has been paying attention to Big Tech's increasing propensity to violate the privacy of users and use their data for questionable reasons, but here we are.
Two days ago, the tech website Venture Beat noticed an eyebrow-raising bit in the latest update to Apple's privacy policy:
Apple's promise of transparency regarding user data means that any new privacy policy update might reveal that it's doing something new and weird with your data. Alongside yesterday's releases of iOS 12, tvOS 12, and watchOS 5, Apple quietly updated some of its iTunes Store terms and privacy disclosures, including one standout provision: It's now using an abstracted summary of your phone calls or emails as an anti-fraud measure.
The provision, which appears in the iTunes Store & Privacy windows of iOS and tvOS devices, says:
To help identify and prevent fraud, information about how you use your device, including the approximate number of phone calls or emails you send and receive, will be used to compute a device trust score when you attempt a purchase. The submissions are designed so Apple cannot learn the real values on your device. The scores are stored for a fixed time on our servers.
Venture Beat points out that this provision is unusual, in part because it includes Apple TVs, which do not have the capability to make calls or send emails.
It is unclear how Apple is going to collect the data, and
It's equally unclear how recording and tracking the number of calls or emails traversing a user's iPhone, iPad, or iPod touch would better enable Apple to verify a device's identity than just checking its unique device identifier. Every one of these devices has both hardcoded serial numbers and advertising identifiers, while iPhones and cellular iPads also have SIM cards with other device-specific codes.
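For context on the identifiers that quote mentions: third-party apps can already read a per-vendor device identifier and the advertising identifier through public iOS APIs. A minimal Swift usage sketch (this would run inside an iOS app; the printing is just for illustration):

```swift
import UIKit
import AdSupport

// identifierForVendor: stable for apps from the same developer on this
// device; may be nil briefly after a restore or reboot.
if let vendorID = UIDevice.current.identifierForVendor {
    print("identifierForVendor: \(vendorID.uuidString)")
}

// advertisingIdentifier: user-resettable, and reported as all zeros when
// the user has turned on Limit Ad Tracking.
let adID = ASIdentifierManager.shared().advertisingIdentifier
print("advertisingIdentifier: \(adID.uuidString)")
```

Hardware serial numbers and SIM-level codes are not exposed to third-party apps, but Apple's own systems can see them, which is the redundancy Venture Beat is pointing to.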
An Apple spokesperson contacted Venture Beat and confirmed that the device trust score is an update included in iOS 12. It was designed to detect fraud in iTunes purchases, as well as to reduce false positives in fraud detection, the rep said. It apparently gives Apple a better likelihood of accurately determining whether content is being bought by the actual named purchaser, Venture Beat explains.
Apple claims that it is still going to protect user data and privacy, and said details of calls and emails will not be collected. The data will be kept for a limited period.
We already know that Big Tech giants like Facebook and Google collect and store massive amounts of user data. If you still use Facebook and would like to see what kind of information they have on you, here's how to get that data. And, if you once used Facebook but have since deleted your account, don't get too comfortable: the social media giant can still track you.
Speaking of Facebook, in August it was revealed that the social media platform rates users based on their trustworthiness, as reported by Engadget:
The company's Tessa Lyons has revealed to the Washington Post that it's starting to assign users reputation scores on a zero-to-one scale. The system is meant to help Facebook's fight against fake news by flagging people who routinely make false claims against news outlets, whether it's due to an ideological disagreement or a personal grudge. This isn't the only way Facebook gauges credibility, according to Lyons; it's just one of thousands of behavior markers Facebook is using.
What other criteria does Facebook measure to determine a users score? Do all users have a score? How are the scores used? These are questions that remain unanswered.
Facebook won't reveal exactly how it evaluates users, claiming that to do so might tip off bad actors who would then game the system.
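Facebook hasn't said how the score is built, so purely to make the Engadget description concrete, here is a toy Swift sketch of what a zero-to-one score derived from behavior markers could look like: a weighted combination of a few signals squashed by a logistic function. The markers, weights, and formula are all assumptions for illustration; nothing here reflects Facebook's actual system.

```swift
import Foundation

// Toy illustration only: Facebook's real scoring method is not public.

struct BehaviorMarkers {
    let falseReportRate: Double     // share of this user's reports judged false
    let confirmedReportRate: Double // share judged accurate
    let accountAgeYears: Double
}

/// Combine a few hypothetical markers into a single value between 0 and 1.
func reputationScore(for m: BehaviorMarkers) -> Double {
    let raw = (2.0 * m.confirmedReportRate) -
              (3.0 * m.falseReportRate) +
              (0.1 * m.accountAgeYears)
    return 1.0 / (1.0 + exp(-raw))   // logistic squash into (0, 1)
}

let score = reputationScore(for: BehaviorMarkers(falseReportRate: 0.4,
                                                 confirmedReportRate: 0.2,
                                                 accountAgeYears: 5.0))
print(String(format: "reputation score: %.2f", score))
```

A real system would presumably learn its weights from data and draw on far more signals than three hard-coded ones.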
As Engadget writer Violet Blue recently pointed out,
The company with the reputation for being the least trustworthy is rating the trustworthiness of its users. We'd be fools to think that it hasn't been doing this all along in other areas. Some animals are more equal than others. The thing is, Facebook long ago decided who was more trustworthy: its real customers, its advertisers. It only pretended you'd be more trustworthy if you gave them your ID.
People are abandoning social media platforms, particularly Facebook, in record numbers, in part because so many privacy violations and data collection practices have been exposed.
Hopefully, this is not a sign that humanity is heading for a Communist China, Big Brother-style social credit rating system. Just two days ago, we reported that the Communist Party's plan in China is for every one of its 1.4 billion citizens to be at the whim of a dystopian social credit system, and it's on track to be fully operational by the year 2020.
If you have watched the Netflix series Black Mirror (in particular, the episode called "Nosedive"), the increasing use of social credit rating systems and trust scores will seem eerily familiar.
If it isn't government spying on us, it is private companies. The surveillance line is becoming more and more blurred, as tech companies increasingly give user data (sometimes forced, sometimes not) to governments upon demand. Edward Snowden exposed the existence of PRISM, under which the National Security Agency (NSA) accesses emails, documents, photographs and other sensitive user data stored by major companies. Documents leaked by Snowden revealed that Facebook, Google, Microsoft, Yahoo, PalTalk, AOL, Skype, YouTube and Apple give the NSA direct access to their users' information.
Going entirely off-grid is looking more and more appealing.
Isn’t this essentially what communist China is doing?
Hey, Swordmaker, this is the type of thing that I have been objecting to about Apple. They can promise not to share info all they want, but nothing is stopping a rogue employee or a change of executives or company policy. Personally, I have little or nothing to hide, but it doesn’t take much for a disgruntled Democrat in the woodpile to make a Federal case out of nothing.
On the other hand, what is the alternative for the total non-geek? Can’t think of any. Would rather have a root canal than go back to the PC platform.
If you want on or off the Mac Ping List, Freepmail me.
While you're BLOG PIMPING, Newby, read post 25 and try to comprehend it. Or are you simply a mindless bot?
Isn’t that the best way to enslave someone? Get them to vie for some sort of reward as you interfere in their rights and privacy....ask any decent psychologist about whether or not this is a way to train people to be your pet rats. I think there’s way more of Apple in China than just manufacturing plants.....