
Apple Will Soon Scan Your iPhone Pictures For Evidence Of Child Sexual Abuse

1,481 views
Published August 06, 2021

Apple will soon implement technology that scans photos uploaded to iCloud for evidence of child sexual abuse, according to a report from The Financial Times and cryptography professor Matthew Green.

The technology, which will roll out with iOS 15 in the U.S., will reportedly use a hashing algorithm known as neuralMatch to scan an iPhone user's pictures for images that match known pictures of child abuse. If too many of a user's photos share a "perceptual hash" with images in that database, the user would be reported to Apple's servers.
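The matching scheme described above can be sketched in a few lines. This is not Apple's actual algorithm (the real system's hash function and thresholds are not public); it is a toy illustration, assuming 64-bit perceptual hashes, a made-up bit-difference tolerance, and a made-up reporting threshold, to show how "close enough" hashes and a match count could trigger a flag:

```python
# Toy sketch of perceptual-hash matching. The hash function itself is
# omitted; we assume each image has already been reduced to a 64-bit
# integer hash. HAMMING_THRESHOLD and REPORT_THRESHOLD are invented
# values for illustration, not Apple's parameters.

HAMMING_THRESHOLD = 4   # max differing bits for two hashes to "match" (assumed)
REPORT_THRESHOLD = 3    # matching photos needed before flagging (assumed)

def hamming_distance(h1: int, h2: int) -> int:
    """Number of bit positions where the two hashes differ."""
    return bin(h1 ^ h2).count("1")

def is_match(photo_hash: int, known_hash: int) -> bool:
    """Perceptual hashes tolerate small differences, unlike exact hashes."""
    return hamming_distance(photo_hash, known_hash) <= HAMMING_THRESHOLD

def count_matches(photo_hashes, known_hashes) -> int:
    """Count photos whose hash is near any hash in the known database."""
    return sum(
        any(is_match(p, k) for k in known_hashes)
        for p in photo_hashes
    )

def should_flag(photo_hashes, known_hashes) -> bool:
    """Flag an account only once enough photos match the database."""
    return count_matches(photo_hashes, known_hashes) >= REPORT_THRESHOLD
```

The tolerance for near-matches is exactly what worries critics: it lets the system catch re-encoded or slightly cropped copies of a known image, but it also creates room for false positives on innocent photos that happen to hash nearby.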

While this could be a boon for preventing child abuse, the ethical and technological problems were apparent to many upon hearing the news.


As Matthew Green explains, if there's a technology that can scan a phone for evidence of child abuse, that technology could be used to scan for anything on a phone deemed wrong by an authoritarian government.


Edward Snowden also shared an article that was critical of the new tech. In an FT article, security engineering professor Ross Anderson said, "It is an absolutely appalling idea, because it is going to lead to distributed bulk surveillance of . . . our phones and laptops."


As for fears of innocuous images being misread and an iCloud user being wrongly flagged, Apple insists that the flagging process includes manual review, meaning human reviewers at Apple will judge whether the images flagged by neuralMatch indeed constitute child abuse.

