2021 Aug 5, 9:46am
Apple intends to install software on American iPhones to scan for child abuse imagery, according to people briefed on its plans, raising alarm among security researchers who warn that it could open the door to surveillance of millions of people's personal devices.

Apple detailed its proposed system, known as "neuralMatch", to some US academics earlier this week, according to two security researchers briefed on the virtual meeting. The plans could be publicised more widely as soon as this week, they said.

The automated system would proactively alert a team of human reviewers if it believes illegal imagery has been detected; the reviewers would then contact law enforcement if the material can be verified. The scheme will initially roll out only in the US.
They will scan for spicy memes next.
Privacy cannot be overridden by the need for safety, without probable cause.
Later, Child Endangerment. Law Enforcement (oh, you know, those helpful Child Protective Services types) will stop by and take away your child.
rocketjoe79 says: "Privacy cannot be overridden by the need for safety, without probable cause."

Isn't there something in the 4th Amendment about that kind of thing?
Will it find the pictures of Tim Cook in butt chaps?
Just saw this on the news. So totalitarian. Time to get freedom phone I guess.
I don't do anything illegal
WookieMan says: "I don't do anything illegal"

What if they make it illegal to leave your house unless you submit to the jab?
Too many high stakes districts where they'd lose in a landslide in 2022 and 2024.
Only if they fix the voting system. At this point I question the legitimacy of ALL politicians.
No practical way to enforce.
I'd agree we're in weird times, but it will never be legal to do so.
Laws don't matter anymore, like elections. We don't have rule of law anymore, nor democracy.
Grabs the photos and covertly sends them to Tim Cook and other pedos. After doing this for a few months, it turns the perp over to the police so they can say they're helping society.
Conclusion

Perceptual hashes are messy. The simple fact that image data is reduced to a small number of bits leads to collisions and therefore false positives. When such algorithms are used to detect criminal activities, especially at Apple scale, many innocent people can potentially face serious problems.

My company's customers are slightly inconvenienced by the failures of perceptual hashes (we have a UI in place that lets them make manual corrections). But when it comes to CSAM detection and its failure potential, that's a whole different ball game. Needless to say, I'm quite worried about this.
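The collision problem described above can be seen even in a toy "average hash" (aHash), one of the simplest perceptual hashes. This is an illustrative sketch only, not Apple's neuralMatch algorithm; the 8x8 "images" below are invented sample data. Because each pixel is reduced to a single above-or-below-the-mean bit, two visibly different images can end up with identical hashes:

```python
def average_hash(pixels):
    """Toy aHash: bit i is 1 if pixel i is above the mean brightness.

    Real implementations first resize the image to 8x8 grayscale;
    here we take a flat list of 64 brightness values directly.
    """
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming(a, b):
    """Number of differing bits between two 64-bit hashes."""
    return bin(a ^ b).count("1")

# Two clearly different "images": a smooth brightness gradient and a
# hard two-tone split. aHash discards all detail except which side of
# the mean each pixel falls on, so both collapse to the same bit
# pattern (32 dark pixels, then 32 bright ones).
gradient = list(range(64))          # smooth ramp, 0..63
two_tone = [0] * 32 + [63] * 32     # abrupt split, different content

h1, h2 = average_hash(gradient), average_hash(two_tone)
print(hamming(h1, h2))  # 0 -- identical hashes despite different images
```

This is the mechanism behind the false positives mentioned in the conclusion: the hash space is far smaller than the image space, so collisions are structural, not rare accidents, and matching is usually done by a Hamming-distance threshold rather than exact equality.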