Apple is a moral company.
They just want to check on you to make sure you’re not raping kids.
Apple unveiled plans to scan US iPhones for images of child sexual abuse, drawing applause from child protection groups but raising concern among some security researchers that the system could be misused, including by governments looking to surveil their citizens.
The tool designed to detect known images of child sexual abuse, called “neuralMatch,” will scan images before they are uploaded to iCloud. If it finds a match, the image will be reviewed by a human. If child pornography is confirmed, the user’s account will be disabled and the National Center for Missing and Exploited Children notified.
Separately, Apple plans to scan users’ encrypted messages for sexually explicit content as a child safety measure, which also alarmed privacy advocates.
The detection system will only flag images that are already in the center’s database of known child pornography. Parents snapping innocent photos of a child in the bath presumably need not worry. But researchers say the matching tool — which doesn’t “see” such images, just mathematical “fingerprints” that represent them — could be put to more nefarious purposes.
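For readers wondering how this kind of fingerprint matching works mechanically, here is a minimal sketch in Python. It is not Apple's NeuralHash (that model has not been published); it uses a generic "average hash" and assumes the Pillow library, a hypothetical known_hashes database, and a made-up distance threshold, purely to illustrate that the comparison happens between hashes rather than between the photos themselves.

```python
# Minimal sketch of generic perceptual-hash matching. This is NOT Apple's
# NeuralHash; it is a simple "average hash" used only to illustrate matching
# fingerprints instead of the pictures themselves. Assumes Pillow is installed.
from PIL import Image


def average_hash(path: str, size: int = 8) -> int:
    """Shrink to a tiny grayscale thumbnail, then set one bit per pixel
    depending on whether it is brighter than the thumbnail's mean."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits


def hamming(a: int, b: int) -> int:
    """Count the bits on which two fingerprints differ."""
    return bin(a ^ b).count("1")


def is_flagged(path: str, known_hashes: set, threshold: int = 5) -> bool:
    """Flag a photo if its fingerprint lands within `threshold` bits of any
    entry in the database of known-image fingerprints."""
    h = average_hash(path)
    return any(hamming(h, k) <= threshold for k in known_hashes)


# Hypothetical usage: the database would hold fingerprints supplied by a
# clearinghouse; the scanner never needs the original images to compare.
# known_db = {average_hash("known_image.jpg")}
# print(is_flagged("photo_about_to_upload.jpg", known_db))
```

The researchers' point is that two very different-looking images can be engineered to land within that matching threshold, which is why the "fingerprints only" framing does not settle the question.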
Matthew Green, a top cryptography researcher at Johns Hopkins University, warned that the system could be used to frame innocent people by sending them seemingly innocuous images designed to trigger matches for child pornography. That could fool Apple’s algorithm and alert law enforcement. “Researchers have been able to do this pretty easily,” he said of the ability to trick such systems.
It used to be that our rights were uninfringeable. Rights were considered sacred, as they composed the foundational order of society. Now, all they have to do is dangle the specter of child rape in front of the goyim, and they automatically surrender all of their rights.
And yes – traditional rights-oriented logic is that it would indeed be better to allow a child to be raped than to take everyone’s basic rights from them.
This logic is dynamic, but the core of it is that rights are uninfringeable, in any context, regardless of what bogeyman the government shows up with. Obviously, the underlying assertion is that a violation of the rights of the people is worse than any single crime that could be prevented by taking those rights away.
Of course, in the age of the coronavirus, when people will surrender all of their rights to fight against a virus that doesn’t even exist, none of the centuries of philosophy on which the concept of freedom is built mean anything at all.
Obviously, this Apple spying will not be limited to child porn – nothing is ever limited to anything, and many Jews have already claimed that “Nazism” is on par with child porn. That is the entire purpose of rights – they have to be defended, regardless of circumstance.
As we saw with the censorship, no one wanted to defend me when I became the first victim of it. Not even Alex Jones was interested in taking up the fight, despite the obvious, looming fact that he would soon be next if they were able to get away with unpersoning me.
The irony is that there is zero chance that Apple actually has any interest in stopping child molestation.
Firstly, their own CEO is a homosexual.
I’ve heard all of the arguments that there is some difference between homosexuality and pederasty – none was even remotely convincing.
Remember when Apple refused to open a terrorist’s phone for the FBI, so the FBI had to go get a zero day exploit from the Israelis?
On the practical front, it must be said: Apple is no longer a viable company to do business with.
I would advise everyone to switch to Chinese electronics. Get a Chinese phone with no Google products installed, and buy a Chinese laptop running Linux.
Imagine that these people are looking you in the eye and saying: “we just want to check to make sure you’re not raping children.”
The grotesque fact of reality is that any nude image of any petite girl who is not endowed with significant breasts is going to be pulled up by these folks and looked at by a person. The person is most likely working from home – due to the deadly Delta variant – and will easily be able to save these images.
You’ll probably also get nabbed if you’re into midgets, dwarves, or Asians.
Shouldn’t we be allowed to vote on this?
Who watches the watchmen?
How is it even legal for this corporation to just go ahead and announce that they are going to be rummaging through your nude images, looking for child porn?
UPDATE:
I ran across a Twitter thread from the head of Facebook’s WhatsApp, which gives a little bit of perspective here. He actually quotes a statement from Apple that was released in 2016 when the FBI was demanding a backdoor to fight hajis and kiddie rapists.
Note that this is supposedly his personal opinion, rather than an official statement from Facebook/WhatsApp. Also note that it is, in effect, a jab by Facebook at Apple, one that follows punches thrown in the other direction.
There is no way Facebook cares more about privacy than Apple, and I’m not posting this to suggest that – I just think a professional from the scene gives context to how totally heinous it is for Apple to just up and announce they’re going to be scanning your photos to make sure you’re not buggering or fiddling the kiddies.
I read the information Apple put out yesterday and I'm concerned. I think this is the wrong approach and a setback for people's privacy all over the world.
People have asked if we'll adopt this system for WhatsApp. The answer is no.
— Will Cathcart (@wcathcart) August 6, 2021
We've worked hard to ban and report people who traffic in it based on appropriate measures, like making it easy for people to report when it's shared. We reported more than 400,000 cases to NCMEC last year from @WhatsApp, all without breaking encryption. https://t.co/2KrtIvD2yn
— Will Cathcart (@wcathcart) August 6, 2021
Instead of focusing on making it easy for people to report content that's shared with them, Apple has built software that can scan all the private photos on your phone — even photos you haven't shared with anyone. That's not privacy.
— Will Cathcart (@wcathcart) August 6, 2021
This is an Apple built and operated surveillance system that could very easily be used to scan private content for anything they or a government decides it wants to control. Countries where iPhones are sold will have different definitions on what is acceptable.
— Will Cathcart (@wcathcart) August 6, 2021
Can this scanning software running on your phone be error proof? Researchers have not been allowed to find out. Why not? How will we know how often mistakes are violating people’s privacy?
— Will Cathcart (@wcathcart) August 6, 2021
There are so many problems with this approach, and it’s troubling to see them act without engaging experts that have long documented their technical and broader concerns with this.
— Will Cathcart (@wcathcart) August 6, 2021
…”it would be wrong for the government to force us to build a backdoor into our products. And ultimately, we fear that this demand would undermine the very freedoms and liberty our government is meant to protect.” Those words were wise then, and worth heeding here now.
— Will Cathcart (@wcathcart) August 6, 2021